What have you done for me lately? (Feb 2023)

After a few previous editions (January 2019, September 2018, and December 2017) I’m back with another recap of all the things I’ve been doing instead of writing blog posts. Without any ado, let’s dive into stuff I’ve been up to!

Blog posts

Same as last time: although I haven’t been blogging much here on my personal blog (only one post, about my Book List), I have been writing content for my employer’s blog:

Not as many blog posts as in previous years, but still a handful!

Advent of Code

The yearly advent calendar of programming puzzles deserves its own section. Three things revolved around this event for me:

Plus I got to craft fresh Egg Nog during Infi’s Meetup around Advent of Code, which is always nice!

Projects

Then there are a handful of full-on side projects:

Sketch notes

I’ve been doing several more (amateur-level) sketch notes of tech events; here are a few links to them (on Twitter only, for the moment):

Now that I’m posting those links, and thinking about the state of Twitter, I should probably also give my sketch notes a more permanent home where I am in control of the data.

What’s next!?

Those were the things I’ve been doing (instead of blogging) in the recent past. I find that I actually only enjoy blogging once in a while, with recaps of all the weird side projects. So up next is probably a bunch more side projects!

Plus, I’m writing this post at the start of an 8-week sabbatical. So who knows? Maybe that gives enough room for one or two side projects!?

Book List – Update January 2022

Ten years ago I started a book list, and in 2014 I provided an update. After some years of radio silence I’ve updated it once more: check out my Book List Page! And let me tell you a bit about the diff between then and now in this post.

First up, here’s a quick glance at the current list:

Screenshot of the 2022 Book List: about 20 covers of recommended books, slightly blurred together, with some jumping out because of their cover (e.g. "Don't Make Me Think", "Culture Map", "Death March", "Influence", "TED Talks").

I’ve reorganised the list once more. Shown above is my current list of “Recommended” books. These books more or less follow these guidelines:

  • I’ve read them, and enjoyed them thoroughly;
  • I’ve learned something from them;
  • They are directly or indirectly helpful for a software developer (or someone in a related role);
  • They are important to read, for one reason or another;
  • They’re timeless, at least to a degree.

Each of those books also comes with a mini-review. Some of the reviews date back to 2014 or earlier; some were written just now, for the updated list.

My Book List also has two extra categories. First, there’s “Recommended, with caveat”: books I think are (usually) good to read, as long as you know the caveat. Finally, there’s a list of books I’ve read and don’t regret reading, but that didn’t make my “top” list: “Good, but not great”.

Curious to know when my next major update to my book list is forthcoming? So am I!

What have you done for me lately? (Jan 2019)

Here’s yet another post similar to the ones from September 2018 and December 2017: a recap of things I’ve been busy with instead of writing more blog posts. It includes several blog posts I’ve written (and possibly should’ve cross-posted here?) for my employer: Infi.

One special thing of note is my previous blog post from two years ago, about “Reproman”. I had a lot of energy and a great plan for a grand new project. First, it somehow didn’t “click” for me… and then a pandemic hit. And my job title changed at (roughly) the same time. So that project didn’t go as planned. I can firmly declare it frozen (if not dead) at this point. There, I’ve said it.

With that out of the way, let’s dive into some things that did happen since that last post.

Blog posts

I have not blogged here, on my personal blog, for two years. That doesn’t mean I haven’t been writing some new content! Let’s start with blog posts I’ve written for my employer’s blog:

Emotionally, it feels right to “restart” the list with all the stuff from “before the pandemic started”:

So no shortage of inspiration for things to write about, as you can see! Just on a different blog, is all.

I will say I’m considering pushing myself to write more frequently here, on my personal blog. But we’ll see.

Projects

In addition to writing for Infi’s blog, I’ve also been busy with a few projects. As always, you can find the more interesting ones on jeroenheijmans.nl, but here are a couple of highlights:

That’s just a few highlights though; check my homepage for all the different projects I’ve been running.

Various

How many blog posts I can write, and how many projects I’m able to produce, of course depends on context. In software development “it depends” is the standard state of things, right? So it makes sense to share a few of the prime influences.

First up: since 2020 I’m no longer only a Technical Lead at Infi. I’ve also signed up to be part of the management team as CTO of Infi Utrecht. I consider getting management responsibilities a demotion rather than a promotion, yet I feel compelled to spend part of my time making and keeping Infi a place I want to work at.

Second, of course, is the pandemic. In addition to having just about every possible privilege already, during the pandemic I also had no kids, a great partner, and a relatively stable job. That makes things (I presume) a lot easier to handle. But still, it affected me regardless. I’ve mostly been acknowledging that, and acting accordingly. I hope you all can do so too!

Finally

In conclusion: what’s next!?

Well, I consider writing part of my job. So if you follow me on Twitter you’ll at least see the posts I write for Infi circulate. In addition, I might write a bit more frequently here too. And if not, I’ll be sure to write another one of these “round-up” blog posts in some months or years.

We’ll see.

Reproman

I’m starting a new project, working title “Reproman”. The idea has been sitting in my mind for months now, but I haven’t found a good way to start yet. So I decided to start here: by blogging about it.

But first something about how and why I got to this point.

Basically, I’d love to specialize in something. Because if I’d specialize, I’d also get a clearer path to speaking at events more often, which is something I’d love to be doing.

On the other hand, I hate specializing because I’m at my best when I get to be a Jack of Many Trades. I enjoy work and hobbies the most if I get to focus on things in short bursts. Becoming one of the best in a specific programming language, or being able to bake the most perfect brownie: it’s just not for me. I want to be good at those things, sometimes even great, and then move on to become good at other things too.

What I realized several months ago: I need to combine these two things!

I have specialized in not being specialized at all. I specialized in learning new things; in getting good at something fairly quickly. So what I will do is try to help other people acquire new skills, at first specifically around technology. This will involve providing content around at least these things:

  • Formulating Issues. Analyzing a problem you have, asking a question, getting help from a colleague, or submitting a bug or feature request: these are crucial when learning new things.
  • Creating Repros. To be able to create minimal conditions, and demonstrate something (e.g. an issue) under those conditions: this is essential when learning new things.
  • List Making. Being able to summarize something, and (importantly!) knowing what you don’t know (yet): this is vital when learning new things.

I’m sure that once I start analyzing how I and others learn new skills in technology, I will come up with more. But you’ve got to start somewhere. And I will start with the above.

Next Steps

So, what’s next with this project?

The most fun way for me to convey these things is in person: by talking about them. Of course, I get my fair share of this at work in projects. But it would be fantastic to get a chance to speak about these things at events. Hopefully I’ll be able to tailor them to specific scenarios, contexts, and technologies.

But before I’m there I need to get my story straight.

Most importantly, I want to write about these things in some form. Possibly in the form of blog posts, or manuals, or a mix of the two. I’m considering starting a specific blog (or even “brand”, if you will) around this idea.

Secondly, I want to learn some new things while doing this. I’m very eager to learn how to make video courses (or at least: short video tutorials), so I might as well mix that in.

Finally, I have several other grand ideas too. There’s making a dedicated website for this material, organizing courses and workshops around this, as well as some even crazier ideas I’m not yet ready to talk about.

In Conclusion

Wrapping up, I mainly wanted to put my thoughts into writing. This already helped a lot in getting my ideas straightened out.

Next up will be some silence, while I build critical mass for an initial launch, something big enough to support new ideas and efforts. I’ll be sure to let you know when it’s ready!

EmpGrid Post-Mortem (and Resurrection!?)

This post is about my pet project EmpGrid. But first a short story about how I work.

About Promises

I try to be extremely precise with promises. In fact, you could say my Promises work very much like the native JavaScript promises. Let’s look at some code:
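
Something like this, in plain JavaScript (a minimal sketch, where promiseSome stands in for any promise-returning function):

    // A hypothetical stand-in for any function that makes a promise:
    const promiseSome = (thing) => Promise.resolve(`did ${thing}`);

    promiseSome("thing")          // always completes, always returns a real promise
      .then((result) => {
        // the normal path: typically a promise resolves successfully
        console.log("Fulfilled:", result);
      })
      .catch((error) => {
        // the exceptional path: a (typically well-defined) error
        console.error("Rejected:", error);
      });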

We can note some things about that code:

  • The promiseSome(thing) bit should always complete successfully, and always return a real, concrete promise;
  • After that, the then(...) bit will be executed nearly always, i.e. typically a promise resolves successfully;
  • Only in exceptional cases, with a typically well-defined error, will the catch(...) occur.

And that is exactly how real-life promises work for me too. I want people, most notably me, to be able to count on a promise being fulfilled (given normal circumstances).

So, why is this relevant to EmpGrid? Well, I promised myself this for 2018, Q3:

Finalize EmpGrid: finish it XOR do a postmortem

So, this post is that finalization. Because I promised!

EmpGrid Post-Mortem

Since I have not fully finished EmpGrid, the only option I have to fulfill my promise is to write a post-mortem about it.

The idea behind the project has remained the same since its inception. It should be a self-contained web application that shows, for a group of employees (colleagues), where you can find them online. Typically, you’d be interested in all their “Presences”, ranging from social media (Twitter, LinkedIn) to version control sites (GitHub, GitLab, etc.).
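
To give an idea of the domain, a rough sketch of the core shape would be something like this (hypothetical C#, not the actual EmpGrid code):

    using System;
    using System.Collections.Generic;

    // Hypothetical sketch of the EmpGrid domain, for illustration only:
    public class Emp
    {
        public Guid Id { get; set; }
        public string Name { get; set; }
        public List<Presence> Presences { get; set; } = new List<Presence>();
    }

    public class Presence
    {
        public string Kind { get; set; }   // e.g. "Twitter", "LinkedIn", "GitHub"
        public Uri Url { get; set; }
    }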

The intended side effects of the project also remained the same. The repository should be slightly over-engineered, since it should serve as a playground to test technologies useful in larger applications. At least for the server-side bits; client-side, a first version could do with a very simple monolithic single-page application.

If you clone the repository today, you get a mostly functional thing. You should be able to run it out of the box (given you have a modern .NET Core development setup), and do some CRUD stuff. However, it is not at all “finished”. So, I’m declaring the project dead today, Saturday, September 22nd, 2018.

Here’s what’s still sorely missing:

  • Some kind of user management. Currently, you can only use the built-in seeded admin user with a default password. You should at least be able to change the admin’s password. But really, you should also be able to create logins linked to Emps, so that people could edit their own details.
  • Persistence. I really wanted to step outside the default EF Core solution that .NET Core pushes you towards. Instead, I would like to see some kind of open source document database integrated. I just haven’t gotten around to it.
  • Cloud-deploy-friendliness. It should be a breeze to deploy the app as a single unit, while still giving a clone-and-run developer experience. Possibly Docker would come into play.

In addition, there’s a bunch of smaller to-do items left in the readme. But the bottom line is: it’s not finished yet.

And that’s okay! Because I learned a lot while building it. Also, the truth is that I’ve spent my time elsewhere these past months. And that was a conscious (and wise) choice.

Still, I’m a little sad to declare EmpGrid dead before it got up and running somewhere…

Resurrection?!

So perhaps I should resurrect the project?

Well, not immediately. Or, more specifically, I will not promise that it’ll be resurrected. For one, I’m due for a long, relaxing vacation in Sri Lanka. And second, I see several other cool ventures coming up (hello Advent of Code!!). So perhaps EmpGrid will stay dead.

Or maybe I need a partner in crime?

Time will tell. For now: goodbye, EmpGrid. Hibernation mode: on.

What have you done for me lately? (Sep 2018)

In December 2017 I wrote a similar post. Over the past few months I’ve been busy with other projects (writing and non-writing), and I’d love to aggregate them all here in a single post.

So, here’s some stuff from the recent past:

Some things that are in progress:

  • 2018-09-xx: Part 2 of the “Auth0 and Angular 6” blog series is forthcoming on infi.nl
  • 2018-09-xx: A status report on my EmpGrid project is in progress

Finally, some stuff I hope to be writing about in the near future:

Stay tuned!

Top 100 Games

I enjoy making “Top X” lists, and tempted by a friendly tweet some time ago I decided to make a “Top 100” list of games.

I’ve ordered these games from “Enjoyed the Most to ‘Least'”. How enjoyable a game was to me can depend on a variety of factors, most notably:

  • how much time I spend enjoying the game
  • how much it allowed me to enjoy it together with friends
  • how much the game “stood out” in story or gameplay

I’ve tried as best I could to compensate for the recency effect and the fact that games from the ’90s are hard to compare to today’s triple-A titles. This is in fact so hard that you might get a slightly different ordering depending on when you ask me. For that reason, everything below the first 20 or so games is in “unordered” buckets.

Let’s get to it!

Top 5! Best of the Best

1. [PC] Commander Keen

All episodes, but my preference would be in this order: 3, 2, 6, 1, 5, 4. (Keen Dreams sucked big time!)

Commander Keen 3

2. [PC] Warcraft 2

So. Many. Hours. Against the CPU. Against my dad. Against my friends. Plain fantastic game!

Warcraft 2

3. [N64] GoldenEye 64

Four-player-split-screen-James-Bond-themed-3D-shooter. What more is there to add!?

GoldenEye 64

4. [PC] World of Warcraft

The sheer number of hours of gaming pleasure with this title makes it a must for the top 10. My favorite expansion would be Wrath of the Lich King, mostly because that’s when most of my real-life friends also played the game (I’ve played The Burning Crusade through Cataclysm).

World of Warcraft

5. [N64] Zelda: Ocarina of Time

There’s something intrinsically great about this game that requires me to place it in a top 10.

Zelda: Ocarina of Time

6 – 20: Fantastic games

6. [PC] Dota 2

With over 1700 hours played to date it’s hard to deny that I’ve enjoyed this game. A lot.

Dota 2

7. [PC] Warcraft 3

Great gameplay, great story!

Warcraft 3

8. [PC] Prince of Persia (the original)

I still have dreams and nightmares about this game.

Prince of Persia

9. [AMIGA] Impossible Mission

The first Amiga title on this list. Spending weekends at my uncle’s place just playing these games: amazing!

Impossible Mission

10. [PC] Neverwinter Nights 1

A very good game, but it’s on this list because I had some awesome full-weekend complete co-op playthroughs that were fantastic.

Neverwinter Nights

11. [AMIGA] Menace

R-Type clone I absolutely loved.

Menace

12. [PC] Wolfenstein 3D

I wouldn’t even have played this game that much, if it weren’t for the homebrew levels my friends made for each other.

Wolfenstein 3D

13. [PC] Hollow Knight

Played only recently, but gameplay was brilliant.

Hollow Knight

14. [PC] Heroes of the Storm

Quit the game multiple times because it was too addictive, so it deserves its spot.

Heroes of the Storm

15. [PC] GTA: San Andreas

Really got me into the sandbox genre.

GTA San Andreas

16. [AMIGA] Klax

Tetris on steroids!

Klax

17. [WII] Super Mario Galaxy

The gameplay of this game was just phenomenal.

Super Mario Galaxy

18. [PC] Wacky Wheels

Yes, I enjoyed this more than Mario Kart. Bite me!

Wacky Wheels

19. [GAMEBOY] Tetris

Dragged me through some long trips that would’ve otherwise been quite tedious.

Tetris

20. [PC] Duke Nukem 3D

Hail to the king, baby! First co-op 3D shooter with amazing level design.

Duke Nukem 3D

21 – 50: Great Games

[N64] Super Mario 64
[AMIGA] Emerald Mine
[PC/RIFT] Robo Recall
[PC] Supaplex
[SNES] Super Mario World
[PC] Assassin’s Creed 1
[PC] Need for Speed – Hot Pursuit
[PC] Starcraft 2
[PC] Super Meat Boy
[PC] Amnesia: The Dark Descent
[PC] Mass Effect 2
[PC] Mark of the Ninja
[PC] Mother Goose
[PC] Trackmania Nations
[PC] Awesomenauts
[PC] Left 4 Dead 2
[PC] Civilization 4
[PC] Fallout 3
[PC] Batman: Arkham City (2011)
[PC] Portal 2
[PC] Portal 1
[N64] F-Zero X
[N64] Wipeout 64
[PC] Reunion
[PC] Guild Wars 2
[PC] Call of Duty: Modern Warfare 2
[PC] Assassin’s Creed 2
[SNES] Super Mario Kart
[PC] Fighter’s Destiny
[PC] Fez

51 – 100: Good Games

[PC] Braid
[PC] Fallout New Vegas
[PC] Batman: Arkham Asylum (2009)
[PC] Centurion: Defender of Rome
[WII] Zelda: Twilight Princess
[PC] Jazz Jackrabbit
[GAMECUBE] Soul Calibur 2
[PC] Starcraft 1
[WII] WarioWare Smooth Moves
[PC] Doom 2
[NES] Super Mario 1
[GAMECUBE] Ikaruga
[PC] Quake 2
[PC] Quake 1
[PC] Company of Heroes 1
[PC] Arkanoid
[PC] Lemmings
[PC] Tony Hawk Pro Skater 2
[PC] Ski or Die
[PC] Secret Agent
[AMIGA] Marble Madness
[PC] Thomas was Alone
[PC] Sim City 2000
[PC] Batman: Arkham Origins (2013)
[GAMEBOY] Super Mario Land
[PC] Magicka
[SNES] Street Fighter 2
[PC] World of Goo
[SNES] Mortal Kombat 2
[PC] Worms
[PC] Skyroads
[PC] One Must Fall 2097
[PC] Boppin
[PC] Divinity: Original Sin
[PC] Duke Nukem 2
[GAMEBOY] Super Mario Land 2
[PC] Killer Instinct Gold
[PC] Captain Comic
[PC] Volfied
[PC] Leisure Suit Larry 1
[N64] Diddy Kong Racing
[AMIGA] Firepower
[N64] Mario Kart 64
[PC] Mega Race
[PC] Prehistorik
[PC] Bioshock 2
[GAMEBOY] Bugs Bunny Crazy Castle
[PC] Quake 3 Arena
[PC] Sim City 1
[PC] Skyrim

Honorable Mentions

Here are all the games I also quite enjoyed, but that didn’t make the list on the particular day I decided to make it. It’s a tough call versus some of the 51-100 games, though.

[PC] Descent
[SNES] NBA Jam
[PC] Dungeon Keeper 2
[PC] Settlers 2
[PC] The Incredible Machine
[PC] Battle Chess
[AMIGA] Mickey Mouse
[PC] Hugo’s House of Horrors
[PC] Carmageddon
[PC] Tomb Raider (original)
[AMIGA] Pacman
[ATARI] COMBAT
[PC] Dragon Age 2
[PC] Flatout 2
[PC] Destruction Derby 2
[PC] Age of Empires
[PC] Limbo
[PC] Paladins
[PC] Guns of Icarus
[PC] Primal Rage
[PC] Skunny Kart
[GAMEBOY] Pipe Dream
[PC] Doom 1
[PC] Dragon Age 3 Inquisition
[PC] MDK 1
[PC] Bastion
[PC] Prehistorik 2
[PC] Prince of Persia 2
[PC] Duke Nukem 1
[PC] Virtual Karts
[PC] VVVVVV
[NES] Double Dragon 2
[PC] Warcraft 1
[PC] California Games
[PC] Unreal Tournament
[PC] Resident Evil 7
[PC] Trackmania 2

And that’s it! So, tell me about your favorite games now…

Initial Oculus Rift Top 5

My employer (Infi) has a tradition of providing three (somewhat ridiculous) options for Christmas gifts. One of them this year was the Oculus Rift with Touch controllers. Given that I had just built a fresh PC with a GTX 1080 earlier this year, the choice was obvious for me.

After two weeks of playing around with it, buying quite a few games, and trying out most of them, I certainly have my favorites. I’m curious to see how this will evolve, so it’s time to log my current, initial top 5 games for the Oculus Rift.

Disclaimer: I get VR-sickness quite quickly, so for obvious reasons stuff like TrackMania and Eve Valkyrie (though great games) are currently out for me.

  1. Robo Recall! It was a hard choice between 1 and 2, but the replay value (achievements and high scores) sealed the deal. What a brilliant game! Bonus points for the funny story line, the option to have multiple saves, the diversity of levels, and the brilliant sound effects and music.
  2. SuperHot VR. You feel like a BOSS in this game, a true action movie hero! I played through the entire game in one weekend (maybe 4-6 hours), which is both a good thing and a bad thing. Also, a slightly more polished meta-interface (save games so I can let others play a separate playthrough, for one) would’ve been nice.
  3. The Climb. This borders on giving me VR sickness, yet I come back to this game every time. So that must be good?
  4. I Expect You to Die. I’ve only done the first level so far, but I’m kind of “saving” the rest of this game for when I really feel like it.
  5. Arizona Sunshine. A toss-up with Killing Floor: Incursion, but I think the zombie shooter should be the last item in my top 5. And I haven’t even tried co-op mode yet.

Honorable mentions should go to The Invisible Hours (which my wife loves, and I can see why), Dark Days for giving me a good few scares already, and Oculus First Contact for being a fantastic tutorial.

My backlog (games I haven’t played enough yet to judge) includes Lucky’s Tale and Echo Arena, so perhaps they’ll show up in a next installment of this blog series. I’m also looking forward to trying Lone Echo at some point, though I’m afraid I’ll get a heavy case of VR sickness from it.

Oh, and the absolute worst? Well, Nature Treks VR was something my wife tried, but we asked for a refund simply because it was so bad. In addition, not really “bad” but more “no experience at all”: Resident Evil 7 for PC doesn’t seem to support the Oculus Rift. Finally, I’m very disappointed that Dota 2 has no good support for the Oculus controllers, making it unusable.

See you in about a year? Hopefully I’ll have a meaningful update by then!

What have you done for me lately? (Dec 2017)

Don’t worry! This is not yet another “I haven’t blogged because…” post. It’s similar though: it’s to aggregate all the cool material I’ve been producing lately instead of writing on my personal blog!

Without further ado, here are the goodies from the past:

Oh yeah, there’s also some stuff going on in the present:

Finally, here’s some cool stuff I might possibly do in the near future:

  • I’ve worked for several months on a Java/Maven/Spring project, which tempts me to write about the differences between Java and C#. No promises though!
  • 2018-01-xx: excited to start on a project that uses ClojureScript, hurray for opportunities to learn stuff! Might even write about it, here or on Infi’s blog.
  • 2018-xx-xx: blog post about Top 100 Games is in progress, though no promises here!
  • 2018-xx-xx: a side project that gets sidelined a lot; I still intend to finish EmpGrid some time.

There are several things on these lists I want to blog about, but I’m not ready to commit to anything yet. Stay tuned!

Getting to the .NET Core of Things

This post aims to help developers from other tech stacks get up to speed with .NET Core. It should be enough to follow further discussions of this tech stack, as well as help you decide whether it’s something you might want to investigate further.

Introduction

Microsoft’s tech stack (for various types of applications) has been .NET for over 15 years now. For most of those 15 years, Microsoft focused exclusively on proprietary, Windows-only software. In recent years Microsoft has shifted to open source and cross-platform solutions in many areas, including .NET. With that, the newest incarnation of .NET is .NET Core, which is completely open source and available across various platforms.

This post explains the state of Microsoft’s tech stack, from the perspective of this new “.NET Core”.

Note on versions: this post was written when .NET Standard 2.0 and .NET Core 2.0 had just come out. Most information also holds for earlier versions, but unless specified otherwise, all text and code below assume version 2.0 to be the context.

How .NET traditionally worked

Let’s first investigate how .NET in general works, with the pre-.NET Core context in mind.

As a developer, you can write some C# or VB.NET code. When you compile this code, you’ll get IL (Intermediate Language), which is bytecode. This bytecode is packaged in DLL and possibly EXE files, which run on any computer. Well, technically they run on any computer… that has the .NET Framework to run them. Remember, we’re talking pre-.NET Core here, so this “any computer” has to be a Windows machine with the proper version of the .NET Framework.

The part of .NET that actually runs the application is the CLR (Common Language Runtime). Included with the CLR are a GC (Garbage Collector) and other memory management tools. Another important part of the .NET Framework is the BCL (Base Class Library), which contains essential base libraries, for example for collections, IO, XML handling, etc.
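
As a tiny sketch of that traditional flow (file and class names made up for illustration):

    // Greeter.cs -- the traditional, pre-.NET Core flow on a Windows machine:
    //
    //   csc Greeter.cs     compiles the C# into IL, packaged in Greeter.exe
    //   Greeter.exe        the CLR loads the IL, JIT-compiles it, and runs it
    //
    using System;
    using System.Collections.Generic;   // collection types come from the BCL

    class Greeter
    {
        static void Main()
        {
            var parts = new List<string> { "CLR", "GC", "BCL" };
            Console.WriteLine("Running on the " + string.Join(" + ", parts));
        }
    }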

In addition, .NET itself also used to ship with application frameworks: for example, frameworks for desktop applications (WinForms and WPF), as well as web application frameworks (ASP.NET). This has changed in recent years. Now almost all application frameworks (including ASP.NET MVC) are distributed as packages. This is done using the NuGet package manager, where application frameworks live as siblings to other libraries and SDKs. Note that Microsoft’s packages sit there alongside the third-party packages.

And that’s all the basics for building .NET applications traditionally. With that out of the way, let’s move on to the interesting bits.

Terminology

The best way to start explaining the “new” .NET situation is by building a glossary.

Terminology around .NET Core has been very confusing for quite some time, but since around mid-2017 things seem to be coming together. I’ve left all obsolete terms (hello there, “DNX”!) for an appendix at the end, and will first focus on current terminology. Here’s a quick overview of the important terms.

Let’s start with the most important thing, which in my opinion is not “.NET Core”. It is .NET Standard, which simply specifies an API. It lists all the (namespaced) types and methods you should implement to create a .NET Implementation (sometimes also referred to as a “.NET Framework” or a “.NET Platform”).

So what .NET Implementations are there then? Several! First, the most well-known one is the .NET Framework, which is available only for Windows.

Second, the .NET Framework has been ported, and this (cross-platform) port is known as the Mono framework. Today Mono is not only a port, but in fact also explicitly a .NET Implementation, as it implements the .NET Standard officially.

Third, there’s Xamarin. There is a company named “Xamarin” (now owned by Microsoft), which develops the similarly named platforms Xamarin.iOS and Xamarin.Android. These are both versions of Mono for their respective mobile platforms. Recent and upcoming versions of Xamarin.iOS and Xamarin.Android will be .NET Implementations that conform to the .NET Standard too.

Fourth and finally, let’s get to the main topic: .NET Core. This is a cross-platform .NET Implementation by Microsoft, conforming to the .NET Standard. Moreover, it’s completely open source, with most parts using the permissive MIT license.

Basically Microsoft re-implemented the Windows-only .NET Framework in the cross-platform .NET Core, where overlap between the two is specified by .NET Standard. Note that large parts of .NET Core are forked from the .NET Framework.

Within .NET Core there are two other important terms. First, CoreCLR is the Common Language Runtime (CLR) of .NET Core. This is the part that runs your .NET Core applications, takes care of memory management, etc. Second, CoreFX is the Base Class Library (BCL) of .NET Core. It contains the basic types, such as those around collections, IO, XML handling, etc. All of these bits and pieces are available cross-platform.

With those terms laid out, let’s dive into the details.

.NET Standard

The .NET Standard API specification for .NET Implementations has different versions. The code and documentation can be found on GitHub, which also shows which implementations conform to each version of the .NET Standard. Here’s a trimmed-down version of the current overview:

.NET Standard versions

For example, from the above you can tell that .NET Core 1.0 implements .NET Standard 1.0 through 1.6. And as another example, .NET Standard 2.0 is implemented by both .NET Core 2.0 and .NET Framework (the Windows-only one) 4.6.1.

You can easily check what’s in a specific version by checking the markdown-based docs for all versions. It includes “diff” files showing what changed since the previous version. For example, this API was added to .NET Standard going from 1.6 to 2.0:

Now for the important part! When writing .NET code, you can choose what your intended target (“Target Framework”) is. But this does not need to be a .NET Implementation: you can also target .NET Standard!

But “Why would you target a spec, which cannot run anything?”, you might ask. The main reason to do that would be when you’re writing some kind of library.

For example, suppose you’re targeting .NET Standard 2.0 with your hip new FooBar library. By using .NET Standard as a Target Framework you’re basically saying: anyone running an app on a .NET Implementation supporting .NET Standard 2.0 can use my library.
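
Concretely, with the current csproj-based tooling that boils down to something like this (a minimal sketch for the hypothetical FooBar library):

    <!-- FooBar.csproj: a class library targeting the .NET Standard 2.0 spec -->
    <Project Sdk="Microsoft.NET.Sdk">
      <PropertyGroup>
        <TargetFramework>netstandard2.0</TargetFramework>
      </PropertyGroup>
    </Project>

Any application running on an implementation that supports .NET Standard 2.0 (so .NET Core 2.0, .NET Framework 4.6.1, and so on) can then reference that library.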

Now suppose you are a library or framework author who publishes things on NuGet. You then have to specify which Target Frameworks your code is compatible with. So from NuGet we can extract interesting statistics, and see that the community is really getting on the .NET Standard bandwagon. Most popular libraries already support .NET Standard, often even from some 1.x version onward (usually 1.3 or 1.6).

In addition to explicit framework targeting, there’s something specific to .NET Standard 2.0. A “compatibility shim” was also rolled out in the tooling around packages, meaning you can use any library that is de facto API-compatible with .NET Standard 2.0, even if the author didn’t explicitly declare it to be compatible. And although this might seem dangerous, it works pretty well in practice, allowing application authors to switch more quickly to .NET Core if they want to.

.NET Core

This is where things get cross-platform! You can download the SDK for Windows, various Linux distributions (e.g. RHEL, Ubuntu and Mint, SUSE), and macOS. The SDK contains both the CoreCLR (runtime) to run applications and the tools needed to create and build applications.

After installing you can use the command line interface to test everything is working. Just:

  1. Create a folder “hellow” and cd into it;
  2. Execute dotnet new console, which generates a minimal project: a hellow.csproj project file and a Program.cs (see the sketch below);
  3. Execute dotnet run;

And you should see the traditional “Hello World!” greeting now.
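
For reference, at the time of writing the console template generates roughly these two files (the exact contents may differ slightly per SDK version):

    <!-- hellow.csproj -->
    <Project Sdk="Microsoft.NET.Sdk">
      <PropertyGroup>
        <OutputType>Exe</OutputType>
        <TargetFramework>netcoreapp2.0</TargetFramework>
      </PropertyGroup>
    </Project>

    // Program.cs
    using System;

    namespace hellow
    {
        class Program
        {
            static void Main(string[] args)
            {
                Console.WriteLine("Hello World!");
            }
        }
    }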

To move beyond the CLI to using an IDE for development, there are several choices.

  • Visual Studio is still probably the best experience on a Windows machine.
  • VS Code is available on Windows, Mac, and Linux, offering a pretty light-weight IDE.
  • JetBrains Rider is an IntelliJ-like IDE for .NET development, available on Windows, Mac, and Linux.

Any code you compile, on any OS, with any IDE, should be runnable on .NET Core on other OSes, as long as .NET Core is installed on that OS.

You can also create “self-contained applications”: applications that include the .NET Core runtime as well. Obviously, you then need to specify the platform to target, because the bundled .NET Core binaries are platform-specific. You do this by publishing with a Runtime Identifier (RID) like “win10-x64”, “osx.10.12-x64”, or “linux-x64”. This will compile your .NET Core application and bundle it with the appropriate version of .NET Core itself.
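
As a sketch, publishing such a self-contained build for 64-bit Linux looks something like this (the exact flags may vary a bit between SDK versions):

    dotnet publish -c Release -r linux-x64

The publish output then contains your application plus the .NET Core runtime for that platform, so the target machine doesn’t even need .NET Core installed.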

And that’s really all there is to it. From here on out it’s all about writing code in a .NET language of your choice. This means C# or F#, or VB.NET in the near future.

Wrapping Up

Microsoft is changing up their game. Although the traditional .NET Framework is here to stay, the new .NET Core framework is the future. They are both .NET Implementations, and yes: they have overlap (as defined by .NET Standard). But you can safely bet that .NET Core and .NET Standard are going to get the focus going forward.

Given that all these efforts are both open source and cross-platform, riding along on that train seems like an excellent idea. Especially if you’re currently using another tech stack but are interested in the .NET ecosystem: now is a great time to hop on and join the ride!

Just give it a go!

~

This post formed the backbone of my talk at DomCode on 2017-08-29. By and large it can be considered a transcript of that talk. If you want, you can also download the slides of my presentation.


Appendix A: Bonus Topics

There are plenty more in-depth and advanced topics. Here’s a quick list of particularly interesting ones you could further pursue:

  • Docker and .NET Core go very well together. The official docs on that should be a good starting point.
  • EF Core (Entity Framework Core) gets a lot of attention too. EF is Microsoft’s ORM framework, and it has its own dedicated (sub)site with more info.
  • UWP (Universal Windows Platform), for creating Windows Store apps that can be cross-platform (including things like Xbox, Windows Phone, HoloLens, etc.), will also likely conform to .NET Standard. Check the main UWP docs for further info.
  • Roslyn is the code name for the open-source compilers for .NET languages. The best starting point for more details is the Roslyn Github repo.
  • .NET Native will allow you to compile your .NET code not to IL (bytecode), but to platform-specific native code. Check the official docs for more info.

Appendix B: Obsolete Terminology

Here’s a short list of (currently) prominent terms that I consider to be obsolete, along with their definition (and the source of that definition, if applicable).

  • DNX (Dotnet Execution Runtime), DNVM (script for obtaining DNX) and DNU (Dotnet Developer Utility) were part of older Release Candidates of .NET Core. The features have mostly been moved to the .NET Core CLI. See the Microsoft docs for more info.
  • project.json was meant to be the new project system, but instead Microsoft decided to move back to csproj files with some new features. Read more on these Microsoft docs pages.
  • PCL (Portable Class Library) was an earlier attempt to help library authors create code that could be reused across various frameworks and platforms. The best reference I could find is these docs from Microsoft. In light of .NET Core you can easily forget about it though, unless you need to convert a PCL project to .NET Core.
  • vNext (which at some point was also called ASP.NET 5) can best be seen as a working title of the next .NET Framework version (the one for Windows only), but has been dropped entirely. About the only semi-sensible reference left is on Stack Overflow.
  • ASP Classic is not really an obsolete term, but rather obsolete technology. The latest stable release was from around the year 2000. It has nothing to do with .NET or the various ASP.NET application frameworks. Wikipedia has a quick history recap if you want it.
