My goal is to use #dlang in 2022

For a few years now, I’ve held the view “stop futzing around and just use C/C++ or Python”.

Recently, I had been poking around with Zig, and it seemed a nice language. It got me wondering whether I should follow the advice on the Interwebs and learn a language a year. After all, we all know that the internet never lies and is full of sage advice.

But which to learn? My “likely” candidates were: Zig, Golang, D. My “unlikely” candidates were: FreePascal, Odin, Pony, Nim, and Rust.

I like to play with microcontrollers, so my thinking is biased towards a systems-level language. That generally means “no GC (garbage collection)”. That rules out Golang; whether it also rules out D and Nim is a matter of vociferous debate among the pedantically inclined.
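
Part of the reason that debate exists: D lets you opt out of the GC on a per-function basis with the @nogc attribute, checked at compile time. A minimal sketch (the function here is my own toy example, not from any library):

// @nogc is enforced by the compiler: the function may not allocate
// from the GC heap, so it is usable in allocation-free contexts.
@nogc nothrow void toggle(ref uint gpioReg, uint mask)
{
    gpioReg ^= mask;              // plain arithmetic: fine
    // auto buf = new ubyte[16];  // compile error inside @nogc
}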

Having used C++, I take it for granted that memory management is pretty much a solved problem. If you want data to be immutable, then pass it as a const. No need for Rust.
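
For what it’s worth, D makes the same promise with const (which, unlike in C++, is transitive). A minimal sketch, using a toy function of my own:

// `const` here means sum() cannot modify xs, or anything reachable
// through it, and the compiler enforces that.
int sum(const int[] xs)
{
    int total = 0;
    foreach (x; xs)
        total += x;
    // xs[0] = 99; // compile error: cannot modify `const` expression
    return total;
}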

Obviously, I’m going to burn for my dismissal of Rust. I’ve tried mucking about with it for my microcontrollers, but never got it to work. OK, I think I got some blinky lights working once, but was unable to repeat the trick months later. No doubt it’s something I did wrong, but I found Rust was never as straightforward as it should be.

I mentioned in a previous post that I had tried to compile someone’s app written in Rust, and that it failed. I expressed surprise at this, as I thought Rust used pinned library versioning, so there should have been nothing to go wrong.

To my eyes, Rust is a game that never really seemed worth the candle. A lot of other people’s mileage differs, though. I am convinced, however, that Rust should be kept well away from the Linux kernel, and I am surprised that Linus Torvalds is so sanguine about the idea. I heard that the Debian team is having problems compiling Firefox for Stable, on account of its stale Rust compiler.

Introducing Rust into the kernel adds yet another dependency, and one that seems to be a bit of a moving target. Rust has too much hype. Wait until everyone is as bored with borrow-checkers as they are with C++’s multiple inheritance before considering integrating it.

Actually, Pony seems to have a very interesting way of dealing with safety and concurrency. It has about half a dozen ways of declaring the type of memory (box, iso, ref, etc.), but I’m afraid my eyes glazed over halfway through the explanation. Pony uses an actor model.

I think the general idea is that an actor acts like a “task”. You can make data mutable within that actor. Actors can call other functions and “behaviours” (like functions, but asynchronous), and pass data between them. The capability attached to the memory a function receives can differ from the one the sender held. The upshot is that concurrency can be made safe and provable at compile time.
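
I can’t write Pony from memory, but D’s std.concurrency module gives a rough feel for the actor style (message passing between threads), albeit without Pony’s capability checking. A minimal sketch:

import std.concurrency;
import std.stdio;

// The "actor": runs in its own thread and reacts to messages.
void worker()
{
    bool done = false;
    while (!done)
    {
        receive(
            (int n)    { writeln("worker got ", n); },
            (string s) { if (s == "stop") done = true; }
        );
    }
}

void main()
{
    auto tid = spawn(&worker);  // start the actor
    tid.send(42);               // asynchronous: main doesn't wait
    tid.send("stop");
    // the runtime waits for the worker thread before the program exits
}

Notably, std.concurrency only lets you send values with no mutable indirection (ints, immutable strings, and so on), which is a faint echo of the safety guarantees Pony bakes in.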

I had been thinking for some time about the whole idea of being able to ensure the correct sequencing of operations in a program. Every function can call any other function; it’s what Pony calls an “ambient capability”. And when anything can call anything, you inevitably get things being called incorrectly. When I heard about the way Pony handles memory, I thought they had tapped into a very rich vein for establishing program correctness. It’s far more subtle than Rust’s “mutability is bad, mmkay.”

I also found Zig interesting due to the fact that it has compile-time execution. That means that there are things you can prove or disprove at compile-time that you couldn’t do in, say, C++, or even in Rust.
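
D, for what it’s worth, has this too, in the form of CTFE (compile-time function execution) plus static assert. A minimal sketch (factorial is my own toy example):

// An ordinary run-time function...
ulong factorial(ulong n)
{
    return n <= 1 ? 1 : n * factorial(n - 1);
}

// ...which the compiler will happily run at compile time. If this
// assertion fails, the program does not compile at all.
static assert(factorial(10) == 3_628_800);

// Or: compute a value at compile time and bake it into the binary.
enum fact12 = factorial(12);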

I decided not to rule out garbage-collection, though, which opened the door to D, Nim and Golang. I figured that Linux is not an RTOS (real-time OS), so I’ll never get real-time behaviour anyway. So then the question is “does GC buy me a higher-level language?”

With great features comes great opacity. You’re left with a design decision: is it a price worth paying? I heard one guy say that C++ was a terrible language because people who programmed in it “didn’t understand their language.”

Well, take strings and vectors in C++. I don’t know what happens under the hood, but they’re surely mallocing, reallocing, and deallocing. Facebook apparently built their own string alternative, which sped up their string processing by 1%. Facebook considered this a huge win.

But think about it. FB poured a lot of effort into something that I would have considered a marginal gain. What it tells me is that despite the mysteries of what string does under the hood, I am unlikely to be able to write a better string class myself.

Now, if I were programming a microcontroller, I’d probably want to avoid string like the plague. But if I’m writing an application to run on an OS, I’d be kinda mad not to make use of the standard C++ library.

So I think it’s important to beware of Holy Wars over programming features. Like the Cenobites, they’re angels to some, demons to others.

Time to take out the trash, Kirsty.

Then, earlier today, I learned that D has a grammar module built in. How cool is that?

Well, how could I resist giving a language like that a spin? Unless it’s Raku, of course, which is much too slow.

D seems to have pretty straightforward C interop. I wanted to see if I could get blinky lights going with D on my Raspberry Pi, which meant linking against the bcm2835 library.

Here’s the code:

import std.stdio;
import core.thread;
import core.time; // for dur!"msecs"

// Declarations for the C functions we need from libbcm2835.
extern (C) int bcm2835_init();
extern (C) int bcm2835_close();
extern (C) void bcm2835_gpio_set(ubyte pin);
extern (C) void bcm2835_gpio_clr(ubyte pin);
extern (C) void bcm2835_gpio_fsel(ubyte pin, ubyte mode);

void main()
{
        ubyte pin = 26;
        if (!bcm2835_init())
                return; // init failed (usually a permissions problem)
        bcm2835_gpio_fsel(pin, 1); // 1 = output (BCM2835_GPIO_FSEL_OUTP)

        foreach (i; 0 .. 10) {
                bcm2835_gpio_set(pin);  // LED on
                Thread.sleep(dur!"msecs"(100));
                bcm2835_gpio_clr(pin);  // LED off
                Thread.sleep(dur!"msecs"(900));
        }
        bcm2835_close();
}

To compile:

gdc blink.d -o blink -lbcm2835
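
If you use dmd rather than gdc, I believe the equivalent is (-L forwards a flag to the linker):

dmd blink.d -L-lbcm2835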

Admittedly the code wouldn’t be any longer in C, but I was pleased that I was able to get a blinky working with very little fuss.

There are nice touches, like the foreach statement; in C, the same loop is more of a chore. D seems to have a fairly intuitive syntax, so I’m looking forward to taking my journey into D further.
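
To show what I mean, here’s a minimal sketch of foreach in D, with the C equivalent as a comment:

import std.stdio;

void main()
{
    // Iterate a numeric range directly:
    foreach (i; 0 .. 5)
        writeln(i);
    // C equivalent: for (int i = 0; i < 5; i++) printf("%d\n", i);

    // Iterate an array, with or without the index:
    auto pins = [17, 22, 26];
    foreach (idx, pin; pins)
        writefln("pins[%s] = %s", idx, pin);
}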

As an aside, the blink sketch I wrote in D works without the need for sudo. My C version requires sudo, or else it segfaults. I have no idea how one can work unprivileged when the other can’t; but in the words of the great sage Homer: “mmm, free goo.”

Anyway, that’s me been rattling on long enough. Have a happy new year, everyone.


1 Response to My goal is to use #dlang in 2022

  1. Guest says:

    I started with D a few years ago, coming from C and C++. Currently I am using D for everything.
    To be honest, I fell in love with the language.
    Good luck with your venture. You will not regret it! 🙂
