Magic Hat – SPO in, XLM out

The MHP (Magic Hat Portfolio) on Stockopedia (http://www.stockopedia.com/fantasy-funds/magic-hat-463/) is an experiment by me to see if a human can improve on a mechanical Greenblatt Magic Formula screen. I am trying to weed out “mistakes” that I feel the screening commits: unseasoned companies, scams, foreign companies (particularly Chinese), fishy accounting, and statistical quirks. Apart from that, I am agnostic as to the sector the company operates in, although I will try to avoid heavy concentration in any one sector. I will mostly apply “Strategic Ignorance”, by which I mean that I won’t try to be clever in my stockpicking. My picking will be mostly mechanical. A summary of most of my Magic Hat articles can be found on the web page http://www.markcarter.me.uk/money/greenblatt.htm. This will allow you to see, at a glance, what shares have been bought and sold in the past, as well as what shares have been rejected from consideration and why.

Happy new year to you all. Mr Market was generous in 2017. The MHP had a grand year, returning 33% against the market’s 8%. Throughout its existence, the MHP has chugged away, outperforming the indices in a slow and steady manner. 2017 was an exceptional year for the portfolio, and I am inclined to put it down to a fluke; the outperformance over the bulk of the rest of its existence was far more modest.

According to StarCapital, the CAPE of the UK at the end of September 2017 was 15.7. That is “about average”, maybe a little under, so we can be sanguine about the possible direction of the market. Unpredictable events may well determine its overall course; we will only know these things with the benefit of hindsight.

The CAPE of the US markets is at an eye-watering 29. I see plenty of risk there. Russia, on the other hand, has a CAPE of 5.6. This valuation is absurdly low. I have been invested in JRS (JPM Russian Securities) for two years now, up 59% since I bought it. It has a dividend yield of 2.6%. Given the current valuations, I will continue to hold.

Emerging Europe as a sector is on a CAPE of 8.9, which looks good value to me. I have some shares in a couple of trusts that invest in this area.

Singapore, as a developed market, looks to be on safe valuations, with a CAPE of 12.9 and a PE of 11.0. It’s a pity that there aren’t any trusts that specialise in that country.

But back to the MHP …

XLM is kicked out by rotation, having returned 99%. Excellent. It has a Stockopedia StockRank of 90, so there would be no reason to sell in a panic. It has a pretty good chart, although its price is arguably overextended on its 50dMA, so a pull-back may be in order at this juncture.

Pool betting operator SPO (Sportech) enters the portfolio. It has a StockRank of 96. So let’s see how that pans out.

That’s all for this month. Stay safe out there.


My opinion on Damian Green

According to a recent BBC article (http://www.bbc.co.uk/news/uk-politics-42193826), Damian Green, described as one of the PM’s closest allies, has been sacked from the cabinet after an investigation found that he had breached the ministerial code. The key issues seem to be:

  • the discovery of legal pornography on his computer, which appears to have been a direct breach of the code
  • an incident with Kate Maltby

I will start by saying that watching pornography is not immoral. You should not regulate or pry into people’s private, legitimate affairs. As a public figure, he should, of course, expect greater scrutiny, and should conduct himself with more decorum than the general public. However, you have to draw the line somewhere, and seeing as his taste for pornography caused no public harm, I think it is wrong to vilify him.

Let’s be honest: sex is in itself a sordid, animalistic act, if you want to put it that way. It is even more so than pornography; after all, with pornography, you’re just looking. Should we therefore sack all people who have had sex? Of course not.

Let he who is without sin cast the first stone.

Next, we have the Kate Maltby affair:

The 31-year old [Kate] claimed the minister “fleetingly” touched her knee in a pub in 2015 and in 2016

Sorry Kate, but you’re going to have to come up with something more substantive than that if you want to ruin a man’s reputation.

The article continues:

sent her a “suggestive” text message which left her feeling “awkward, embarrassed and professionally compromised”.

OK, so what does that mean? Was it merely flirtatious, which she found unwelcome, or was it predatory? There’s a difference.

Her statement is so vague and insinuating that, by itself, it provides no reasonable grounds for censuring Green.

Here we run into the problem of “pussification of the west”. Adults are reduced to infants, unable to handle small trifles. It is narcissistic and self-entitled. It is an insidious evil – and make no mistake, it is an evil – where people are harmed merely by vague accusation or act of slight.

We now have policies of Zero Tolerance throughout the workplace.

Let us not forget some basic principles of law: de minimis non curat lex (the law does not concern itself with trifles) and culpae poenae par esto (let the punishment fit the crime).

You can see that Zero Tolerance, as an idea, doesn’t stack up well against principles that have been established over many centuries. Zero Tolerance lacks any leeway, any application of common sense and reason, any judgement or sense of proportionality; and really, any sense of justice. This is the legacy of Cultural Marxism. It is regressiveness parading as progressiveness. Political correctness gone haywire.
Great harm will come from this.

Update 21-Dec-2017: I’ve just seen a bit more on the BBC. Apparently, the chief cause of concern was the statements that Green made to the police about the possession of pornography. It seems that the possession itself was not the issue, nor was the matter with Kate. May’s hands were effectively tied on the issue (figuratively speaking, of course): lying to the police was a serious breach of the ministerial code which she could not allow to go unanswered. One does wonder, though, why the police were snooping around Green’s laptop. So May’s actions seem justifiable, on this reading.


Exploring a byte-code compiler using #cplusplus

bcode

Let’s build a byte-code compiler.

Motivation

There are several factors that have led me to write “bcode”, my attempt at a byte-code compiler:

  • my general interest in compiling technology
  • my involvement with neoleo, a spreadsheet that I forked from oleo, GNU’s spreadsheet program, which has not been updated for many years. Part of the program parses user formulae, compiling and decompiling them to byte-codes as required. Lex and yacc are used. Oleo was originally written in C, and I was interested in converting it to C++. The workings of the compiler and parser were, and to some extent still are, a bit of a mystery to me. I figured that there must be a better way to do it.
  • my involvement with ultimc. Ultimc is a project that I created to work in conjunction with Ultibo, a unikernel for the RPi (Raspberry Pi). I figured that it would be nice to have a scripting environment for the unikernel, and set about creating Glute (“gluing utility”, although I like the name because it was vaguely rude). Although it currently “works”, it can only deal with commands. You cannot perform branching, or define functions.
  • I was inspired by Jack Crenshaw’s series of articles, Let’s Build a Compiler, to try my own take on the subject. Jack creates a compiler that spits out Motorola assembly. The assembly can then be assembled into an executable, assuming that you have access to the Motorola chipset.
  • J.P. Bennett’s book “Introduction to compiling techniques – a first course using ANSI C, LEX and YACC” was of interest to me. I bought the book in 1996. I never made my way through all of it, although I occasionally dipped in, latterly more than formerly. I am glad I bought the book, as it is a pleasant introduction to compiling techniques.

Bennett’s work adopts the typical approach to writing a compiler:

  1. define a lexer using lex. He discusses ad hoc lexing techniques, where you basically roll-your-own lexer
  2. analyse the syntax using yacc. He also discusses top-down parsing.
  3. produce an intermediate representation of the compiled code. He chooses an idealised assembly language
  4. the idealised assembly language or intermediate code can then be compiled to native machine code. Bennett actually creates an interpreter for the assembly language, which is a perfectly reasonable choice.

Method of attack

I have this to say on compiler construction tools:

  • I would rather avoid code generation. Ideally, the code should just work, without any need for tool intervention. I think a lot of the drawbacks come from the limitations of C itself
  • they’re too complicated. This is a contentious point, I know. Everybody’s mileage varies on this. You have to learn them, and it’s difficult to know if they create more problems than they solve
  • the tools don’t especially fit harmoniously together. I always feel that it’s like trying to push two magnets together with the same polarity head-to-head. You can do it with sufficient force, but I would rather not.
  • they seem designed for the C era of programming. I want to write in C++, where encapsulation is better, and I feel I have more certainty about memory hygiene (no leaks). I noticed with neoleo, for example, that it is difficult to reason about memory. I have ideas about how it could be better, although they are difficult to realise given the nature of the tools I have

I think lex and yacc can be considered obsolete for the following reasons:

  • Rob Pike, of Go, Unix and Plan 9 fame, seems to prefer hand-rolled lexers
  • hand-rolled lexers are OK, but I think they require finicky bookkeeping of token state and the state of the input stream
  • if you don’t want to go down the lex route, hand-rolling a very low-level lexer is too much fiddle for you, and you are willing to sacrifice some efficiency, then you can use C++ regexes. By defining a list of them and checking against them in turn, you can effectively achieve what lex does, without the hassle of a separate tool. My blang project uses such an approach, and I think it completely obviates the need for lex
  • nearly all compiler writers today seem to use recursive-descent parsers, having abandoned yacc
  • recursive-descent parsers also fit human intuition well
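To illustrate the regex-based approach, here is a minimal sketch in C++ (the token classes and names here are illustrative; they are not blang’s actual rules). Each token class is a pattern anchored at the start of the remaining input, tried in order:

```cpp
#include <cassert>
#include <regex>
#include <stdexcept>
#include <string>
#include <utility>
#include <vector>

// A token rule: a class name and an anchored pattern.
struct TokenRule { std::string name; std::regex pattern; };

// Lex the input into (class, text) pairs, skipping whitespace.
std::vector<std::pair<std::string, std::string>> lex(std::string input) {
    static const std::vector<TokenRule> rules = {
        {"NUMBER", std::regex(R"(^\d+)")},
        {"IDENT",  std::regex(R"(^[A-Za-z_]\w*)")},
        {"OP",     std::regex(R"(^[-+*/=])")},
        {"WS",     std::regex(R"(^\s+)")},
    };
    std::vector<std::pair<std::string, std::string>> tokens;
    while (!input.empty()) {
        std::smatch m;
        bool matched = false;
        for (const auto& r : rules) {
            if (std::regex_search(input, m, r.pattern)) {
                if (r.name != "WS")                      // whitespace is discarded
                    tokens.emplace_back(r.name, m.str());
                input = m.suffix();                      // consume the match
                matched = true;
                break;
            }
        }
        if (!matched) throw std::runtime_error("bad character at: " + input);
    }
    return tokens;
}
```

It is slower than a hand-rolled character loop, but all the fiddly state tracking disappears: the remaining input is just a shrinking string.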

I need an interpreter rather than a compiler. I am willing to sacrifice speed, and I am not looking to perform any optimisations. I had originally thought that a good idea would be to avoid writing any kind of byte-code compiler and just walk a parse tree as needed; I think that could be reasonably efficient.

One of the real problems with compilers is that they rely on a lot of different “types”: integers versus strings, for instance, or function blocks versus arithmetic. The compiler needs to work with variants of data types, commonly called ADTs (Algebraic Data Types), or sum types. These stand in contradistinction to product types, more commonly known as “the struct”. Product types are widely supported in many languages; ADTs rarely feature in language designs. The notable exceptions are languages like Haskell, which excels at using ADTs. Lisp-like languages are also well suited to ADTs, as they can deal with data representations dynamically. For a language like C++, it has always felt like trying to put a square peg in a round hole. C++17 does finally provide a variant type. As of writing (Dec 2017), though, compilers with full C++17 support have not yet made their way into popular Linux distributions. I also wonder whether C++’s solution is entirely satisfactory: it has chosen an object-oriented, library-based approach to variants, whereas I wonder if they would be best dealt with by syntactic extension. Time will tell.
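As a taste of what C++17 offers, here is a minimal sketch of a sum type holding either a number or a string, with std::visit dispatching on whichever alternative is held (the names are illustrative; they are not taken from neoleo):

```cpp
#include <cassert>
#include <string>
#include <type_traits>
#include <variant>

// A value is either a number or a string: a sum type.
using Value = std::variant<double, std::string>;

// std::visit calls the lambda with the alternative actually held.
std::string describe(const Value& v) {
    return std::visit([](const auto& x) -> std::string {
        if constexpr (std::is_same_v<std::decay_t<decltype(x)>, double>)
            return "number: " + std::to_string(x);
        else
            return "string: " + x;
    }, v);
}
```

Compare this with a Haskell `data Value = Num Double | Str String`: the same idea, but expressed through library machinery rather than syntax, which is the source of my reservation above.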

Given the above factors, I now think that byte-compiling is an idea I want to explore. So my idea is that the parser, instead of constructing a parse tree, just spits out byte-code. In Tcl, as the saying goes, “everything is a string”. It is a data-hiding mechanism. I think it is a good approach for compilers too, except that for “string”, read “series of bytes”.

So this is the approach I will adopt here:

  • instead of trying to compile a full-blown language, I will try to write a compiler for a simple assembly language. And I’m not kidding here, the parsing should be as simple as possible
  • that assembly will compile to byte-code for an “idealised” stack machine
  • the byte-codes will be interpreted

I think this is a good approach. You can compile a fully-fledged language by emitting those assembly instructions. The nature of a top-down parser is that you can generate the instructions in postfix form as you go; the tree is implicit in the recursion, so you never need to construct a syntax tree explicitly. This is great, because it means you don’t have to store variant-type intermediate structures: you just emit assembly code as you parse, in the manner adopted by Crenshaw.
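To make the idea concrete, here is a minimal sketch of such a stack machine in C++, with a made-up three-instruction set (bcode’s actual opcodes may well differ):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// A toy instruction set: PUSH <n> pushes a literal byte as an integer,
// ADD pops two values and pushes their sum, HALT stops execution.
enum Op : uint8_t { PUSH, ADD, HALT };

// Interpret a byte-code stream; returns the top of the stack on HALT.
int run(const std::vector<uint8_t>& code) {
    std::vector<int> stack;
    size_t pc = 0;                           // program counter
    while (true) {
        switch (code[pc++]) {
        case PUSH:
            stack.push_back(code[pc++]);     // operand follows the opcode
            break;
        case ADD: {
            int b = stack.back(); stack.pop_back();
            int a = stack.back(); stack.pop_back();
            stack.push_back(a + b);
            break;
        }
        case HALT:
            return stack.back();
        }
    }
}
```

Compiling the expression `2 + 3` in postfix order would emit `PUSH 2, PUSH 3, ADD, HALT`, and `run` would return 5.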

The code

I am writing my assembler piecemeal, adding features as I go. I do hope I make it to the end. So here are the parts:

Part 1 – halting, pushing and extensible execution, no branching

 

References:

  • The document above is available here.
  • For more details, see part 1, where I get into the instruction set and implement it in code. Branches are not yet implemented. I intend to work on them next.

Enjoy!


Magic Hat – DTG stays in

The MHP (Magic Hat Portfolio) on Stockopedia (http://www.stockopedia.com/fantasy-funds/magic-hat-463/) is an experiment by me to see if a human can improve on a mechanical Greenblatt Magic Formula screen. I am trying to weed out “mistakes” that I feel the screening commits: unseasoned companies, scams, foreign companies (particularly Chinese), fishy accounting, and statistical quirks. Apart from that, I am agnostic as to the sector the company operates in, although I will try to avoid heavy concentration in any one sector. I will mostly apply “Strategic Ignorance”, by which I mean that I won’t try to be clever in my stockpicking. My picking will be mostly mechanical. A summary of most of my Magic Hat articles can be found on the web page http://www.markcarter.me.uk/money/greenblatt.htm. This will allow you to see, at a glance, what shares have been bought and sold in the past, as well as what shares have been rejected from consideration and why.

Airline operator DTG (Dart Group) is due for ejection this month, as it is no longer in the Greenblatt screen. However, its Stockopedia StockRank is 94, it is classified as a SuperStock, and it has a Greenblatt rating of A-, rather than A+. I’m going to be lazy and keep it in.

Taking a look at the chart of the fund’s performance, I see that it has performed exceptionally well against the FTSE 350. I should note a couple of caveats, though:

  1. although the fund has outperformed the index fairly consistently, the really superior gain has been over the course of 2017. This somewhat lumpy gain could be just luck.
  2. the FTSE 350 is heavily weighted to the Footsie. The FTSE 250 has performed exceptionally well against the FTSE 100 over the course of the existence of the fund, so the fund’s performance would look far more modest if it were compared against the mid-caps.

So, that’s it for this month. Have a merry Christmas, and let’s hope we get lots of mild weather.

Stay safe out there.


#awk the underappreciated

At Softpanorama:

Most people are surprised when I tell them what language we use in our undergraduate AI programming class. That’s understandable. We use GAWK. GAWK, Gnu’s version of Aho, Weinberger, and Kernighan’s old pattern scanning language isn’t even viewed as a programming language by most people. Like PERL and TCL, most prefer to view it as a “scripting language.” It has no objects; it is not functional; it does no built-in logic programming. Their surprise turns to puzzlement when I confide that (a) while the students are allowed to use any language they want; (b) with a single exception, the best work consistently results from those working in GAWK. (footnote: The exception was a PASCAL programmer who is now an NSF graduate fellow getting a Ph.D. in mathematics at Harvard.) Programmers in C, C++, and LISP haven’t even been close (we have not seen work in PROLOG or JAVA)

Full version on Ronald Loui’s webpage.


#Ultibo – a “unikernel” for the #raspberrypi

I just came across Ultibo, a “unikernel” for the RPi (Raspberry Pi), any version. What a fascinating idea it is! If you are like me, you have no idea what a unikernel is. Wikipedia describes it as:

a specialised, single address space machine image constructed by using library operating systems.[1][2] A developer selects, from a modular stack, the minimal set of libraries which correspond to the OS constructs required for their application to run. These libraries are then compiled with the application and configuration code to build sealed, fixed-purpose images (unikernels) which run directly on a hypervisor or hardware without an intervening OS such as Linux or Windows.

From what I gather, a unikernel is designed to run a single application on virtual hardware; to be contrasted with an exokernel, which is designed to run multiple applications that needn’t know of each other’s existence, on real hardware. And to be contrasted yet again with a conventional kernel, like Linux or Windows, which provides a whole host of services.

You write your application in Lazarus, which is an IDE for the FreePascal compiler; Ultibo uses a custom version of Lazarus. I’m no Pascal wizard, but I was able to get a “program” up and running that created a console, asked for the user’s name, and printed it back to the console. I say “program”, but it is actually the complete guts of the machine. There’s no separation between program and kernel.

Writing in Pascal is much more pleasurable than writing in C. For example, Pascal has proper strings.

When you build the project, Lazarus creates a kernel.img file. You then copy that to your SD card, insert it into your Pi (I chose a Pi 2), and boot your machine. It boots straight into your program.

Here’s a simple program that I wrote:

program Joy;

{$mode objfpc}{$H+}

{ Raspberry Pi Application }
{ Add your program code below, add additional units to the "uses" section if }
{ required and create new units by selecting File, New Unit from the menu. }
{ }
{ To compile your program select Run, Compile (or Run, Build) from the menu. }

uses
  RaspberryPi,
  GlobalConfig,
  GlobalConst,
  GlobalTypes,
  Platform,
  Threads,
  SysUtils,
  Classes,
  Ultibo,
  Console,
  Framebuffer
  { Add additional units here };

var
  WindowHandle: TWindowHandle;
  name: String;

begin
  { Add your program code here }
  WindowHandle := ConsoleWindowCreate(ConsoleDeviceGetDefault, CONSOLE_POSITION_FULL, True);
  ConsoleWindowWriteLn(WindowHandle, 'Hello Ultibo! What is your name?');
  ConsoleReadLn(name);
  ConsoleWindowWrite(WindowHandle, 'Your name is ');
  ConsoleWindowWriteLn(WindowHandle, name);
  ThreadHalt(0);
end.

It creates a console, asks you for your name, and prints it back to the console. Then stops. Not, perhaps, the most useful of programs, but I was impressed by the ease of it.

I have no idea how the authors of Ultibo manage to get all this to work, but it works quite well.

The work is reminiscent of Charles Moore’s colorForth, and Niklaus Wirth’s Oberon OS.

Ultibo is well suited to those who like tinkering; so those interested in an RPi should find it very appealing. The RPi is an excellent platform to target, too.

I look forward to delving deeper into the system. I would like to try building a version of it for Linux; Ultibo’s tooling currently runs on Windows. Lazarus runs on all platforms, so I think it should be possible to compile a version for Linux. Theoretically.

Anyway, worth checking out.

[Image: EMUZ80-RPI.png, taken from the web]


Review of #elegoo basic start kit

Recently I decided to buy the “Elegoo UNO Project Basic Starter Kit with Tutorial and UNO R3 for Arduino” from Amazon. It cost £13.99.

For those who may be unaware, the small board electronics niche has two undisputed leaders: the Arduino, and the RPi (Raspberry Pi). The Arduino is a microcontroller board, whilst the RPi is a microcomputer board. The Arduino can be used for electronic projects, i.e. controlling electrical circuitry, whilst the RPi is basically a complete computer with which you can also create electronic projects. The Arduino costs £28, and the RPi3 costs a fraction more, at £32. Given the sheer scope of what you can do with the RPi, the advantage generally lies with the RPi, at least in my opinion.

Although the Arduino is quite expensive for what it is, it is an open hardware platform, and there are plenty of clones out there. One very popular one is produced by Elegoo. Elegoo boards are designed and produced in China, which is often a bit of a warning sign.

However, the Arduino comes as just a basic microcontroller on a board, whilst the kit I purchased is half the price of that, and comes with leads, a breadboard, resistors, LEDs, pushbuttons, a tilt detector, a photometer, and a CD explaining how to use the kit.

Being a Chinese product, the quality is not quite up to the standards set by Western companies. I inserted a lead in the breadboard, for example, and when I came to remove it, the lead’s pin broke free and remained stuck in the breadboard, so I had to remove the pin manually.

Despite these glitches, the kit worked well otherwise.

The instruction manuals that come with Elegoo kits are often derided. However, I found the CD to be well written and informative. It appears to have been written by someone for whom English is a native language. I felt that the instructions were close to the standard that you would find in, say, a CamJam EduKit. So don’t dismiss the manuals; I found them educational. Use them!

Would I recommend this kit? Despite my seemingly somewhat negative comments above, I would say “definitely yes”. The price is unbeatable, and you get bits and bobs to play with, plus a set of instructions. This makes it an ideal introduction to electronics.

Remember, an Elegoo is a clone of the Arduino UNO, so it can do everything an Arduino can do, in exactly the same way. Throw in a selection of components and an instruction manual at half the price of a bare board, and what’s not to like?

[Image: elegoo.jpg – lots of blinkenlights going with the basic starter kit]
