CDE in #Ubuntu 16.10


CDE (Common Desktop Environment) is an old-school desktop environment for UNIX.

My first contact with UNIX was in 1992, when I was studying for a Masters in Industrial Mathematics run jointly by Strathclyde University and GCU (Glasgow Caledonian University). The latter university was about to change its name from Glasgow Polytechnic at the time.

Strathclyde University ran Sun workstations, whilst GCU ran Windows. Needless to say, I preferred the Strathclyde computers, as it was my opinion that Windows machines didn’t really “do” anything. GCU did have quite a big UNIX machine located in the library. You could use a terminal there, but it was inconvenient. I think I only used it once.

One of our GCU lecturers made us do some FORTRAN programming as part of the course, which we did by connecting to the library’s computer via Windows. There was a simple terminal, so to type in programs we used “ed”.

Ed was, needless to say, an utter pain to work with. God forbid that you would make a typo, because then what would you do?

The upshot was that it was always easier to use the UNIX machines at Strathclyde where possible. The UNIX machines had a proper range of software, including such useful tools as Mathematica. Actually, it might not have been Mathematica, but something like it. Not Matlab. Would anyone care to jog my memory as to what the program might have been called?

The UNIX machines had nice big screens. The file servers did go down a few times. One time, our lecturer on Time Series and Forecasting – who was actually a bit of a shirty geezer who was probably unsuitable for lecturing – told us to overwrite our .bashrc file so that we could get a relevant program working.

Rather naively, many did this, only to discover that many other settings that we needed were obviously zapped in the process. Clearly, many of us did not have much of a concept as to what was going on, and there was much tutting at the mess he made.

At the time, I didn’t even understand the concept of directory structures, and I thought they were too complicated. You have to remember that this was the early 90’s, so there was little exposure to how computers really worked. I had a mathematical background, not a computing one.

I do not really remember what the UNIX DE (Desktop Environment) was. Was it CDE? I’m not sure. According to Wikipedia, CDE was released in June 1993, which would have been towards the end of my course. It must have been fairly similar, though.

During one class, some wag sent a picture of a scantily-clad young lady to my workstation, much to my embarrassment. It was simply a matter of setting DISPLAY to a different IP address, and opening up a display viewer. Ah, the days of innocence, when nobody worried about security issues. It’s also interesting to note that even in the early 90’s, there were pics of scantily clad young ladies floating about. I guess it must have been like a caveman from a technologically primitive tribe hearing about paint for the first time, and upon being invited to view his first cave gallery, discovering that they were mostly pictures of scantily clad beauties.

After finishing my Masters, I then went on to study for a PhD in mathematics at GCU. We were given individual PCs to work on. I had a Mathematica (?) licence. I also had Fortran. I used both extensively. I think the PCs were 386’s, maybe 486’s, so presumably we were on 32-bit machines, rather than 16-bit.

I seem to recall that Phar Lap was involved, which was a DOS extender that allowed my Fortran compiler to work. Windows 3.1 was the main OS. The other lads used LaTeX to prepare their reports. I used it for a while, but my professor convinced me to use Word.

Word was frustrating, because I obtained different output depending on where I printed my report. It screwed up my nice layout. It took me a while to figure out what was going on. Word is device-dependent, so output could vary according to the device driver. LaTeX is device-independent, so output could be depended upon to be consistent. I love the look of LaTeX documents; the typesetting almost looks like a work of art to me.

The group that I was working in had a DEC Alpha, which cost £20k. A lot of money! I had free access to it, but it was somewhat inconvenient to use, so I just used to stick with my PC.

In retrospect, the DEC Alpha was ridiculously overpriced. We were each issued with our own PCs. The DEC cost far more than all of our PCs combined. Is it any wonder that the days of the workstation were numbered?

Sometime into my PhD, the head of department ordered half a dozen SGI machines for use by the faculty. The interface was quite spiffy, and they had webcams. How high-tech was that! By today’s standards, the webcams were of poor quality.

As an aside, does anyone remember the film Lost in Space, released in 1998? It featured an in-film advert by SGI, right about the time the company was headed for financial disaster.

In the mid 90’s I switched from an Amiga to a PC at home. The Amiga did have a Fortran compiler, believe it or not. Not bundled, of course, it was freeware. The program was called BCD. I don’t think it was bug-free, though. I did most of my programming using Blitz Basic. My simulations did, of course, finish faster on the university PC compared to my Amiga.

My PC came with Windows 95. The film Forrest Gump came bundled with it on a separate CD. It’s quite impressive how much was achieved with such lowly-specced machines.

Linux was beginning its ascendancy, and I installed Slackware off a CD that came with a book. The book seemed mostly cobbled-together, I thought. Making things work was a tricky affair, as you needed to know a fair bit about your hardware to get the drivers for X-Windows working.

Throughout the middle of the 90’s, I did not have much exposure to UNIX machines, despite the impression you might have. I tinkered with it a bit, but most of my work was really with PCs. I did visit one lecturer at Strathclyde who was very enthusiastic with his Linux setup, and was keen to evangelise it to me. He showed me how he typed his LaTeX document in emacs. Emacs was able to compile the documents via external programs. It was magical how it was all combined. He was thrilled with it, and I was very impressed.

In the late 90’s I went to work for Logica. It was there that I became properly reacquainted with UNIX. I worked in Space and Defence. So, of course, we used real machines, none of that foo-foo nonsense that the business department used.

We used Sun Ultras. Again, I can’t remember what the Desktop Environment was. I did like it, though. I was writing C code. In Emacs, naturally. It was likely that I was using the CDE. This would have been about the time that GNOME was in its ascendancy, and there were signs that Sun were going to adopt it as their main desktop environment.

The Suns I used almost certainly did not run GNOME, though.

I remember being very impressed with the Sun package manager. Put in the relevant CD, and an installed package was just moments away.

I also liked the way you could boot Sun workstations. Each had a little EPROM in it, so you could save settings that would take effect when you next rebooted.

If only we could boot computers the way we boot Suns! Multi-booting operating systems on today’s PCs seems like more hard work than is strictly necessary.

We paid Sun about £20k pa for tech support, running to 5 machines. My project used tech support a lot. One tech that I ‘phoned expressed astonishment at the number of calls we raised.

We had a few site visits to replace hardware that was going down the pan. Some of it was getting a bit old, and technologically obsolete. It didn’t help, either, that much of the equipment was housed in a safe for mandatory security reasons. It got a bit hot in there.

Our project had to support some old devices, Lord alone knows why. Some were far from elegant. There was an optical drive that could store around the same amount as a CD. Considering the sizes of CD drives these days, the hardware seems unnecessarily big. Maybe the hardware was a little more reliable, but that’s debatable.

When I left that particular project, I went on to work on a Java project. At the time, DEC was looking like it was going bankrupt. That was a big scare, because it meant that an array of VAX machines that controlled satellites would be unsupported.

That’s not the kind of thing you want to hear about.

That particular project was to convert Fortran written with DEC-specific extensions into vanilla Fortran. We did this using a converter that we wrote in Java. We wrote the software on PCs. The translation ran overnight.

This must have been the time that Java was only starting to come out. Unbeknown to us, we used some Java functions that were not available in the OS of our client. So our client had to upgrade their OS just to run our software. They ran into a few snags, but nothing serious.

OK, so that’s some back-story, what do I think of CDE now?

I think it’s … um … interesting. It looks a little clunky. Window decorations look too “fat”. So a lot of real-estate is being wasted. The terminal fonts are similarly clunky, but there is an option to reduce their size. It then looks neat.

In my opinion, the task panel (or whatever they call it) is too chunky, too. Some modern-day DE’s make that mistake, too.

I do not like the way that you can overlay windows on top of it.

On the other hand, it does have a certain whimsical charm to it. I think you would get used to it, and, dare I say it, even prefer it.

I think it also demonstrates that, even in the 90’s, desktop environments were mostly “there”; and that we have since been spending our time reinventing wheels and piling on the kilobytes.

Will it replace my LXDE? Realistically, no. The task panel in LXDE is better and works the way I want it to.

There possibly isn’t much that isn’t fixable, though. I applaud the efforts of the team that is bringing CDE into open source. It needs to mature a bit to make installation more seamless.

It’s also worth remembering that CDE is very lightweight, and I can see it being used by a few distros. I am given to understand that it does not work with Arch yet. I am sure that crowd will be particularly keen. There really ought to be a Slackware package for it, too. Come on Slackers, this is exactly the kind of retro stuck-in-the-90’s stuff that you should revel in.

In some ways, CDE reminds me how UNIX and its clones should be. None of that systemd and D-Bus baloney. GNOME seems to increase its resource requirements whilst simultaneously reducing its functionality.

Now that I put it that way, it seems obvious where all the blame lies: Red Hat, possibly with special mention to Poettering.

OK. Now for some technical stuff. I present some notes that I made to compile CDE on Lubuntu 16.10. I thoroughly expect them to work on Ubuntu 16.10, too. Here they are:

 sudo apt-get -y install git build-essential g++ lib{xt,xmu,xft,xinerama,xpm,motif,ssl,xaw7,x11,xss,tirpc,jpeg,freetype6}-dev tcl-dev ksh m4 ncompress xfonts-{100,75}dpi{,-transcoded} rpcbind bison xbitmaps

Download the sources (I used cde-src-2.2.4.tar.gz).

unpack, cd, etc.

 make World
 sudo make install   # doesn't seem to do much

 export PATH=$PATH:/usr/dt/bin
 export LANG=C
 cd /usr/dt/bin


Unfortunately, my X-server crashed whilst I was taking notes, so they are incomplete.

Be sure to check out this wiki page for more information and guidance.

Kudos to the CDE guys! I really do hope the project is a huge success.

Posted in Uncategorized | Leave a comment

$OUT.L – Outsourcery – from the Land of Penny Dreadfuls


Sometimes, for amusement, I like to read the posts of hvs over on ADVFN. Maybe his writing is, shall we say, “not everyone’s cup of tea”, but if he is negative on a share, then it’s a strong sign that the share is to be avoided.

I came across this post in April 2016, where he posted the comment “Is it not bankrupt?”

To be fair, most of the bulletin board participants seem to have figured out that OUT was toast, but there were still rampers making comments like “Moving up again. Massive interest” and “hugely undervalued and qith [sic] the news with Vodafone im sure this will rally”.

In June 2016 the company announced the appointment of administrators. The shares are currently suspended, and I assume that current holders will receive nothing.

This is presumably not the result you would have expected if you read their interim results, issued on 29 September 2015:

Outsourcery continues to secure a strong market position as a fully converged provider of cloud solutions with high profile partners and reference end-customers … Recurring revenue is increasing, and the Company is delivering against a large addressable market … Gross margin has increased steadily

Subsequent proceedings were ominous, though:

  • On 6 October 2015, OUT issued an RNS stating that a co-CEO had purchased shares totalling £5,200. That does not seem to be much of an effort on his part, and I would naturally take that as a bearish, rather than bullish, signal.
  • On 4 November 2015, OUT announced the appointment of Ms de Sousa as Managing Director. More on that later!
  • On 24 November 2015, OUT announced a change of auditor. A change of auditor is grounds for suspicion. The auditors had been in place since 2011, so a reasonable man might argue that I am being over-cautious. In retrospect, though, it seems that the incumbent auditors were uncomfortable on signing off the numbers. I am guessing, of course, but with the benefit of hindsight, it is the most plausible explanation.
  • On 29 January 2016, OUT announced that de Sousa would no longer be taking up her position. Hmmm, what’s all this then? Apparently the decision was based on “family reasons”. It appears that she decided to have a baby, rather than join the gallant crew at OUT.
  • On 25 April 2016, OUT announced that it would release its full-year results for the year ended 31 December 2015 by 30 June 2016. You have to ask yourself: why does such a small company need so long to prepare its accounts? The company did give unaudited figures, though, stating that revenues would be up 9%, and gross cash would be down by 64%. We are then given a killer sentence: “Whilst the full year results and current trading are in line with the Board’s expectations, this view factors in that revenue growth from the Company’s strategic partners has been impacted by further partner product launch delays outside of Outsourcery’s control.” Also “The Company will require further funding for short term working capital purposes” [emphasis mine]. Predictably, the shares tanked.
  • On 3 June 2016, OUT issued an update and suspension of trading: “Pursuant to the conditions of the working capital facility … it is no longer possible to present audited financial results for the year ended 31 December 2015 by 30 June 2016.” It also noted that “[w]hile discussions with third parties are ongoing, they have had no material adverse impact on current business activity, with Outsourcery securing new direct customer contracts”.
  • On 15 June 2016, it announced the appointment of administrators, and the resignation of its nominated advisor

So basically, game over.

Be careful out there.




$DPLM.L – diploma – chugging along nicely

Industrial supplier DPLM (Diploma) announced its finals today, sending its shares up 1.2% to 910.5p in early trading. Revenues were up 15%, profit before tax was up 5%, free cashflow was up 46%, and total dividends were up 10%. So everything is going in the right direction.

The Chief Executive commented:

Despite the current macro-economic uncertainty in the global environment, the Board remains confident that the Group will continue to make further progress in the coming year from a combination of steady GDP plus organic growth and a strong and successful acquisition programme.

Dividends have grown by 16% pa over the last 5 years, and more than quadrupled over the last decade. According to Stockopedia, its average ROCE over the last 5 years was 23.9%. It has net cash.

DPLM is part of my LTBH (long-term buy-and-hold) portfolio due to its combination of high ROCE, conservative financing, and steady increase of dividends. It is a “steady compounder”.

DPLM is rarely mentioned on the bulletin boards due to its lack of excitement value. Lack of bulletin board interest is usually considered a good sign. It does not appeal to value investors, as it is not a value share. It does not appeal to momentum investors, as you do not expect much momentum out of it. It is not a high-growth speculative stock which attracts the growth investors. It is not a big-cap, and it is not a small cap. So it slips under everyone’s radar.

And yet there it sits, quietly chugging along each year, pumping out ever-increasing dividends.

Readers with a good memory may recall that I sold out of DPLM a few years ago because the valuation looked a bit stretched. It was a decision I rued. I bought back in nearly a year ago. The shares are up around 31% over 1 year, handily beating the Footsie, which is up 7% over the same period.

It is the kind of share that you have to resist selling. Stockopedia shows that its PE ratio is 19.4, which I think is at the upper end of what you can safely pay. It has a momentum score of 93, which is good, but I think it may have to trade sideways for a while to let fundamentals catch up with valuation. It is just not a whizz-bang stock that attracts daft valuations.

I hold, and am happy to stand pat. Good long-term share for the patient.



#gnu spreadsheet #oleo resurrected!

Exciting news: anyone remember the old GNU spreadsheet oleo? The latest version, 1.99.16, was released in 2000.

The sources had suffered from severe bitrot, and no longer compiled. That is until I decided to have a go at porting it. I had made an attempt on a previous occasion, but basically made a right hash of it. I gave it another shot recently, and managed to get it working in under a day.

I modernised the configuration script, although the original Makefiles probably contained much subtlety that I have overlooked.

I am super-excited about this, as it means that a project that has been effectively dead for over 15 years now has a new lease of life. I have disabled X/Motif support – but there’s a good chance that it should be fairly easy (?) to get it working again. Currently, oleo works with ncurses. It would be really neat to have a spreadsheet that works in X.

I have called the project neoleo, which is a play on words. neoleo rolls off the tongue nicely as a word. The “ne” is Esperanto for “not”, so neoleo is “not oleo”. You could also interpret it as neo- as in new or recent.

I am interested in exploring how oleo compares in power with other command-line spreadsheets like sc and teapot. It will be interesting to see if I can get the Motif interface working again. Users will then have a GUI. Whilst nowhere near as featureful or as pretty as LibreOffice or Gnumeric, it is vastly more lightweight. The basic ncurses version requires no extra libs other than ncurses itself, which is of course standard on UNIX.

Anyway, download, compile, enjoy:

I’d be interested in hearing from anyone who wanted to patch the official oleo release.


$DTG.L – Dart Group – good half-year report

Airline, holiday, and logistics operator $DTG.L (Dart Group) issued its half-year report today, sending its shares up 8.2% to 436p at the close. Revenues were up 21% over the comparable period. Basic EPS was up 14%. The interim dividend is up 53%, showing confidence in the direction that the company is going. Load factors were slightly lower, but they did have increased seat capacity. They state:

Whilst we recognise the likely upward pressures on market pricing following the weakening of Sterling post Brexit; for the long term, we have confidence in the resilience of our Leisure Travel business and are encouraged by the increasing proportion of customers taking our great value, real package holidays. With winter 2016/17 Leisure Travel bookings continuing to perform in line with expectations, the Board is currently optimistic that market expectations for the full year will be slightly exceeded.

The shares have been a disappointment over the last year, with the share price down 4.8%, whilst the Footsie is up 11.0%. However, Stockopedia reports that DTG stands on a PE of 10.5, which is cheap considering that it is still very much a growth company. Perhaps fears of its cyclical nature and potential vulnerability to economic events have held the share price down. Given the positive statement today, and the market reaction to the results, I think a re-rating of its shares is in order.

I have been a holder of DTG for a few years now.



Oh wait, now #lisp is faster than #cobol

So I tweaked my input-parsing code in both Lisp and COBOL, and switched to SBCL (Steel Bank Common Lisp). When compiled, here are the running speeds of my programs:

 bash     0.289s
 cobol    0.260s
 sbcl     0.150s
 C++(++)  0.411s

My COBOL code is not as complete as the Lisp code, as it has some experimental sections. It uses STRING and UNSTRING a lot. STRING functionality was a recent addition, and it seems to completely kill performance. The COBOL code is not even processing all the inputs.

So far then, COBOL is not performing up to the level of a naive bash script I wrote. The bash script is shorter.

You might be asking why the C++ program takes so long. Well, the C++ is the main calculating engine, and it calls the bash command as a preprocessor, interprets the output from it, does the actual complicated processing, and finally calls a post-processing bash script. It is doing much more work than all of the other programs.

So at least 0.289s of that 0.411s is just running the preprocessor. The sensible next step would be for me to try to cut out the preprocessing script entirely, and subsume it in C++. It seems reasonable to suppose that the time saved would be very high.
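For anyone wondering how a C++ engine drives a bash preprocessor like this, the usual mechanism is POSIX popen(): run the script, slurp its stdout, then carry on with the heavy processing in-process. Here is a minimal sketch of that idea — the function name and command are illustrative, not lifted from my actual program:

```cpp
#include <cstdio>
#include <stdexcept>
#include <string>

// Run a shell command (e.g. the bash pre-processing script) and
// capture everything it writes to stdout as a single string.
std::string run_preprocessor(const std::string& cmd)
{
    FILE* pipe = popen(cmd.c_str(), "r");
    if (!pipe)
        throw std::runtime_error("popen failed for: " + cmd);
    std::string output;
    char buf[4096];
    while (std::fgets(buf, sizeof buf, pipe))
        output += buf;           // accumulate the script's output
    pclose(pipe);
    return output;               // hand the text to the main engine
}
```

Subsuming the preprocessor would then amount to replacing the popen() call with native C++ that performs the same text transformation, saving the cost of spawning a shell at all.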


#cplusplus meets #cobol and #lisp

Call it madness, but I have a fascination with trying to transcribe my C++ accounts program to COBOL.

I have an idea of a kind of interoperability facility, where I can code in a mixture of both. As a first stage, I have written a small program in Lisp which contains a record schema and an input converter. Here is the schema:

(defparameter *amount* '(amount dec 12 2))
(defparameter *dstamp* '(dstamp str 10))
(defparameter *quantity* '(quantity dec 12 3))
(defparameter *ticker* '(ticker str 6))

(defparameter *yahoo* `(,*dstamp*
                        (tstamp str 8)
                        (rox dec 6 4)
                        (price dec 12 6)))

(defparameter *record-descriptor*
  `(("comm-1" (,*ticker*
               (dload str 1)
               (typ str 4)
               (unit str 3)
               (name str 22)))
    ("etran-1" (,*dstamp*
                (folio str 4)
                (amount dec 12 2)
                (way str 1)
                (taxable str 1)
                (desc str 20)))
    ("leak-1" (,*dstamp*
               (folio str 4)
               (taxable str 1)
               (desc str 20)))
    ("nacc" ((dr-acc str 4)
             (cr-acc str 4)
             (acc-type str 1)
             (mult dec 1 0)
             (desc str 20)))
    ("nb" ((desc str 20)))
    ("ntran" (,*dstamp*
              (dr-acc str 4)
              (cr-acc str 4)
              (desc str 20)))
    ("yahoo" ,*yahoo*)
    ("yahoo-1" (,@*yahoo*
                (desc str 20)))))


I have written some Lisp functions that take in free-form input, dispatch on the type of record, and produce fixed-width line output. The next step would be to see if I could generate COBOL copybooks and C++ structures from the schema.

On the COBOL side, this should make reading in the data fairly straightforward, as it can transfer a line from input directly into the copybook structure.

I think C++ would be trickier. It doesn’t have COBOL-style fixed-length string fields, so there are no structs that I can simply serialise and deserialise as flat text.
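One workaround — and this is only a sketch of the general idea, not code from my actual program — is to drive the parsing from a width table, much like the Lisp schema above: each field is a name plus a character count, and deserialising a record is just slicing fixed-width substrings off the line in turn.

```cpp
#include <cstddef>
#include <string>
#include <vector>

// A fixed-width field, mirroring schema entries like (ticker str 6).
struct Field {
    std::string name;
    std::size_t width;
};

// Slice a fixed-string record into its fields by walking the
// schema and taking width-sized substrings in turn.
std::vector<std::string> parse_record(const std::string& line,
                                      const std::vector<Field>& schema)
{
    std::vector<std::string> values;
    std::size_t pos = 0;
    for (const auto& f : schema) {
        values.push_back(line.substr(pos, f.width));
        pos += f.width;
    }
    return values;
}
```

Serialising is just the reverse: pad each value to its width and concatenate. It is clumsier than a COBOL copybook, where the record layout gives you this for free, but it does keep the C++ side honest about the same schema.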

On the other hand, if I want to cut Lisp out of the loop, it will be more difficult to persuade COBOL to convert raw input into record format. I had solved that problem a long time ago in C++.

My COBOL is virtually non-existent, so there’s certainly a learning curve in me massaging it into behaving the way I want it to.

My Lisp is fairly weak, but to a large extent, you don’t have to know much Lisp. I found myself creating the requisite macros quickly. The point that stood out to me is that I had to spend a long time on the formatting routines. The format statement is powerful, but somehow it did not seem powerful enough. I had to wrap it in a few convenience functions to obtain the output I wanted.

What I also noticed is that Lisp (I am using clisp) is a slow old beast. It took nearly 0.5 seconds (clock time) to process the input. My guess – and it’s a guess shaped by an analogous experience – is that C++ would be able to process this in at most a tenth of the time, and I imagine COBOL would be somewhere in that region, too. The notion, as some Lispers would claim, that Lisp could be “just as fast as C++” seems fanciful.

COBOL is a strange language. It is, at once, an easier, and more difficult, language to use.

The experiment continues.
