Posts in Xml (4 found)
underlap 2 months ago

Blogging in markdown

I recently switched my blog to eleventy, so posting now consists of editing a markdown file and regenerating the site. There are several benefits. This post discusses markdown footnotes, the YAML preamble used in eleventy markdown files, how best to configure markdown in eleventy, and how I use Code OSS to edit markdown.

But first I’d like to reflect on the advantages of markup languages, of which markdown is one. WYSIWYG (What You See Is What You Get) has sometimes been described as WYSIAYG (What You See Is All You’ve Got). In other words, the content doesn’t necessarily imply the logical structure of a document. Being able to see the structure of a document makes it more readable. Also, I’ve seen Microsoft Word documents that would make your toes curl: [1] the author used arbitrary formatting to achieve what, in their opinion, looked good. But in doing so, they failed to provide a logical structure.

I have used WYSIWYG editors, such as Microsoft Word and OpenOffice/LibreOffice, but I tend to spend too long fiddling with the formatting, which is a distraction from writing the content. Also, I have experienced situations where a document gets corrupted and cannot be opened. This is more likely with WYSIWYG editors, which store documents in a binary format.

Therefore I much prefer markup languages over WYSIWYG. The structure of a document is clearer and there’s less need to pay attention to formatting while writing. I’ve used various markup languages over the years: GML, SGML [2] (briefly), HTML, LaTeX, and markdown. I really like LaTeX, especially when mathematics is involved, but markdown has the advantage that the source is more readable. The authors of RFC 9535, of which I was one, used markdown [3], so it’s even suitable for writing technical documents.

That said, let’s look at one of the main benefits of moving my blog to eleventy. The beauty of markdown footnotes is that they are numbered and sorted automatically.
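For illustration, a named footnote in the common markdown footnote syntax looks like this (the footnote name here is made up for the example):

```markdown
WYSIWYG has sometimes been described as WYSIAYG.[^wysiayg]

[^wysiayg]: What You See Is All You've Got.
```

The rendered output numbers the footnotes in order of reference, regardless of what the definitions are named or where they sit in the source.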
Using a meaningful name for a footnote, rather than a number, makes it easier to keep track of which footnote goes with which reference. With manual numbering, adding a footnote in the middle of the sequence was awkward and error prone. Also, the footnotes can be kept near where they are referenced, rather than having to be put at the bottom of the file. I installed a footnotes plugin [4] to use markdown footnotes in eleventy.

So much for one of the main benefits of using markdown for blog posts. On the other hand, an unfamiliar feature was forced on me by the eleventy base blog: each markdown post has to start with a preamble written in YAML. A preamble seems like a reasonable place to store metadata for a post, such as its title and tags. I’m still getting used to listing tags in the preamble. WriteFreely used to render hashtags automatically, which was more in line with the philosophy of markdown. Also, it would be more natural to use a top-level heading at the start of a post to indicate the title.

The default configuration of eleventy markdown isn’t ideal, so I changed it. One setting ensures semantic line breaks: breaking a paragraph across multiple lines is not reflected in the rendered version, so it’s possible to put each sentence on its own line. This makes for better readability of the markdown and better diffs. [5] If you need persuading of the advantages of this, see Semantic Linefeeds, which includes the quote below, Semantic line breaks are a feature of Markdown, not a bug, and Semantic Line Breaks.

Hints for Preparing Documents

Most documents go through several versions (always more than you expected) before they are finally finished. Accordingly, you should do whatever possible to make the job of changing them easy. First, when you do the purely mechanical operations of typing, type so subsequent editing will be easy. Start each sentence on a new line.
Make lines short, and break lines at natural places, such as after commas and semicolons, rather than randomly. Since most people change documents by rewriting phrases and adding, deleting and rearranging sentences, these precautions simplify any editing you have to do later. — Brian W. Kernighan, 1974

Another setting ensures proper quote marks are used and various constructs are replaced by symbols.

I’m using Code OSS (the open source variant of VSCode) for editing. Yeah, I know: it’s not Emacs or vi. But it does have a ton of useful features and plugins which work out of the box. In addition to the built-in markdown editing and preview support in Code OSS, I installed the following plugins: [6]

Markdown Footnote - renders footnotes correctly in the preview.

Markdown yaml Preamble - displays the preamble at the start of the preview. [7] For example, the preamble of this post renders in the preview as a table.

Markdown lint - helps enforce a standard style for writing markdown.

I’m pretty happy writing blog posts as plain markdown files. There are many more advantages than disadvantages. Let’s see if my opinion is the same in six months’ time.

The most egregious examples have been students’ assignments, but others have come close. ↩︎

This was with Framemaker. I can’t remember whether the markup was actually SGML or XML. ↩︎

We actually used kramdown, a dialect of markdown geared towards writing IETF specifications. ↩︎

The markdown support used by eleventy base blog. ↩︎

particularly if you use ↩︎

I used arch’s code marketplace to install plugins from the VSCode marketplace. This seems legit if I restrict myself to plugins with an OSS license. After all, I could have downloaded the source of each plugin and installed it in Code OSS. ↩︎

Having the table in the preview at least means the title features somewhere. But I’d prefer the plugin to render the title as a heading, so I suggested this in an issue. ↩︎
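For reference, the two markdown settings this post describes (semantic line breaks and smart typography) can be sketched as an eleventy configuration. This is a sketch assuming eleventy's default markdown-it library; the option names are markdown-it's, and the wiring shown is illustrative rather than my exact setup:

```javascript
const markdownIt = require("markdown-it");

module.exports = function (eleventyConfig) {
  const md = markdownIt({
    html: true,
    // breaks: false means single newlines do NOT become <br>,
    // so each sentence can sit on its own source line (semantic line breaks).
    breaks: false,
    // typographer: true produces proper quote marks and replaces
    // constructs such as (c) and -- with symbols.
    typographer: true,
  });
  eleventyConfig.setLibrary("md", md);
};
```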

1 view
Playtank 6 months ago

My Game Engine Journey

There, but certainly not back again.

It’s sometime around the late 1980s/early 1990s that some developers start talking about a “game engine” as a thing. Maybe not even using the term “engine” yet, but in the form of C/C++ libraries that can be linked or compiled into your project to provide you with ready-made solutions for problems. Color rendering for a particular screen, perhaps, or handling the input from a third-party joystick you want to support. The two Worlds of Ultima games are built on the Ultima VI: The False Prophet engine, as a decently early example.

When you put a bundle of these ready-made solutions together, it becomes an engine. In those days, the beating heart would usually be a bespoke renderer: software that transforms data into moving pictures and handles the instruction set of whichever hardware it’s expected to run on. What id Software perhaps revolutionised, if you are to believe John Romero in his autobiography Doom Guy: Life in First Person (an amazing book), was to make developer tools part of this process. To push for a more data-driven approach where the engine was simply the black box that you’d feed your levels and weapons and graphics into. This is how we usually look at engines today: as an editor that you put data into and that makes a game happen.

To give some context for this, I thought I’d summarise my personal software journey. One stumbling step at a time, and not all of it strictly engines.

When I grew up in the 80s/90s, I was often told that programming was simply too hard for Average Joe poor kids like myself. You had to be a maths genius and you had to have an IQ bordering on Einstein’s. At a minimum, you needed academic parents. If you had none of those, programming wasn’t for you. Sorry. This is the mindset I adopted, and it affected my early days of dabbling profoundly.
Where I lived, in rural Sweden, there were no programmer role models to look up to, and there was no Internet brimming with tutorials and motivation either. Not yet. We didn’t have a local store with game-stocked shelves or even ready access to computers at school. Again, not yet.

But eventually, maybe around the age of 10 or so, I ran into QBASIC on the first home PC that was left over from my dad when he upgraded. Changing some values in the game Gorillas to see what happened was my introduction to programming in its most primitive form. Ultimately, I made some very simple goto-based text adventures and even an attempt at an action game or two, but I didn’t have enough context and no learning resources to speak of, so in many ways this first attempt at dabbling was a dead end. It’s clear to me today, looking back, that I always wanted to make games, and that I would probably have discovered programming earlier if I had been introduced to it properly.

Even if I felt programming was too complicated, I did pull some games apart and attempt to change things under the hood. One way you could do this was by using a hex editor (hex for hexadecimal) to manipulate local files. This is something you can still use for many fun things, but back then hexadecimal was part of how games were packaged on disk. (Maybe it still is and I’m letting my ignorance show.) The image below is from Ultima VII: The Black Gate seen through a modern (free) hex editor called HxD. As you can see, it shows how content is mapped in the game’s files.

Back then, my friends and I would do things like replace “ghoul” in Ultima VIII with “zombi” (because it has to be the same number of letters), or even attempt to translate some things to Swedish for some reason. (To be fair, the Swedish translation of Dungeon Keeper 2 is in every way superior to the English, simply because of how hilariously cheesy it is.)
To grab this screenshot I could still find the file from memory, demonstrating just how spongy and powerful a kid’s brain really is…

With Duke Nukem 3D, and to a lesser extent DOOM, I discovered level editors. The Build Engine, powering the former, was a place where I spent countless hours. Some of the levels I made, I played with friends. I particularly remember a church level I built that had sneaky pig cops around corners, and how satisfying it was to see my friends get killed when they turned those corners. How this engine mapped script messages to an unsigned byte, and memorising those tiny message codes and what they meant, were things I studied quite deeply at the time.

I fondly remember a big level I downloaded at some point (via 28.8 modem, I think) that was some kind of tribute level built to resemble Pompeii at the eruption of Vesuvius. It’s a powerful memory, and I’m quite actively not looking to find that level, to get to keep the memory of it instead. The fact that walls couldn’t overlap, because it wasn’t actually a 3D engine, was one of the first stumbling steps I took towards seeing how the sausage gets made.

Several years after playing around with the Build Editor, I discovered WorldCraft. I built a church here too for some reason, despite being a filthy secular Swede, and tried to work it into a level for the brilliant Wasteland Half-Life mod. This was much harder to do, since it was fully 3D, and you ran into the limitations of the day. The engine could only render about 30,000 polygons at a time, meaning that sightline optimisations and various types of load portals were necessary. Things I learned, but struggled with anyway. Mostly because the Internet was still not that great as a resource. Had I been smarter, I would’ve started hanging around in level design forums. But level design never stuck with me the way programming eventually would.
During this time, I also learned a little about tools like 3D Studio Max, but as with programming in the past, I thought you had to be much better than I was to actually work on anything. My tip to anyone who is starting out in game development: don’t bring yourself down before you even get started. It can deal lasting damage to your confidence.

During the late 90s and early 2000s, something came along that finally “allowed me” to make games, at least in my head. At first it was DarkBASIC, a BASIC variant with added 3D rendering capabilities, produced at the time by the British company The Game Creators. This discovery was amazing. Suddenly I was building little games and learning how to do things I had only dreamed of in the past. None of it was ever finished, and I always felt like I wasn’t as good as the people from the online communities. It’s pretty cool, however, that Rami Ismail hung out in these forums and that I may even have competed against him in a text adventure competition once.

Along the way, I did learn to finish projects, however. I made two or three text adventures using the DarkBASIC sequel, DarkBASIC Professional, and even won a text adventure competition all the way back in 2006 with a little game I called The Melody Machine.

In 2005 I enrolled in a game development programme in the town of Falun, Sweden, called Playground Squad. It was the first year that they held their expanded two-year vocational education for aspiring game designers, programmers, and artists. My choice was game design, since I didn’t feel comfortable with art or code. This was a great learning experience, particularly meeting like-minded individuals, some of whom are still good friends today. It’s also when I started learning properly how the sausage gets made, and got to use things like RenderWare Studio: an early variant of an editor-focused game engine, where designers, programmers, and artists could cooperate more directly to build out and test games.
It was never a hit the way Unity or the UDK would become, but I remember it as being quite intuitive and fun to play around with. We made one project in it, a horde shooter thing. I made the 3D models for it in Maya, which isn’t something I’ve done much since.

I don’t remember what SimBin called their engine, but I got to work in two different iterations of it in my first real job at a game studio, as an intern starting in 2006. One engine was made for the older games, like RACE: The WTCC Game, which became my first published credit. The other was deployed on consoles and was intended to be the next-generation version of SimBin technology. There I got to work on particle effects and other things, all scripted through Lua or XML if I recall correctly, and wrote up bugs in bug-tracking tools while performing light QA duties. To be honest, I’m not sure SimBin knew what they needed any designers for. But I was happy to get my foot in the door.

My best lesson from SimBin was how focused they were on the types of experiences they wanted. They could track the heat on individual brakes, the effects of the slipstream behind a car in front of you, and much more. They also focused their polygon budget on the rear of cars, since that’s the part that you see the most. You typically only see the front of a car model in the mirror, rendered much smaller than you see the rear of the car in front of you. This is an example I still use when talking about where to put your focus: consider what the player actually sees the most.

I did work with the visual scripting tool Kismet (precursor to Blueprint) and Unreal’s then-intermediary scripting language UnrealScript in my spare time in 2006, for a year or so. It had so many strange quirks that I just never got into it properly. First of all, Unreal at the time used a subtractive approach to level editing, unlike the additive approach that everyone else was using, which meant that level design took some getting used to.
With BSP-based rendering engines, the additive norm meant that you had an infinite void where you added brushes (like cubes, stairs, cones, etc.) and that was your level. In the UDK, the subtractive approach meant that you instead had a filled space (like being underground) where you subtracted brushes to make your level. The results could be the same, and maybe hardcore level designers can tell me why one is better than the other, but for me it just felt inconvenient. I never got into the UDK properly, because I always felt like you had to jump through hoops to get Unreal to do what you wanted it to. With Kismet strictly tied to levels (like a Level Blueprint today), making anything interesting was also quite messy, and you had to strictly adhere to Unreal’s structure.

My longest stint at a single company, still to this day, was with Starbreeze. This is the pre-Payday 2 Starbreeze that made The Chronicles of Riddick: Escape from Butcher Bay and The Darkness. The reason I wanted to go there was the first game, the best movie tie-in I had ever played: a game that really blew my mind with its clever hybridisation of action, adventure, and story. Starbreeze was very good at making a particular kind of game: highly cinematic elements mixed with first-person shooter. If this makes you think of the more recent Indiana Jones and The Great Circle, that’s because Machinegames was founded by some of the same people.

Starbreeze Engine was interesting to work with, with one foot firmly in the brushy BSP shenanigans of the 90s (additive, thankfully), and the other trying to push forward into the future. Its philosophies, including how to render a fully animated character for the player in a first-person game, and how scripting was primarily “target-based” in the words of the original Ogier programmer, Jens Andersson, are things I still carry with me.
But as far as the work goes, I’m happy that we don’t have to recompile our games for 20 minutes after moving a spawnpoint in a level anymore. (Or for 10-24 hours to bake production lighting…)

During my time at Starbreeze, I finally discovered programming and C++ and learned how to start debugging the Starbreeze Engine. Something that made my job (gameplay scripting) a lot easier and finally introduced me to programming in a more concrete way, at the ripe age of 26.

At first, I tried to use the DarkBASIC-derived DarkGDK to build games in my spare time, since I understood the terminology and conventions, but soon enough I found another engine that felt more full-featured. It was called Nuclear Fusion. It was made by the American one-man company Nuclear Glory Entertainment Arts, and I spent some money supporting them during that time. Now they seem to have vanished off the face of the Earth, unfortunately, but I did recently discover some of the older versions of the software on a private laptop from those years. As far as projects go, I never finished anything in this engine, but I ended up writing the official XInput plugin for some reason. Probably the only thing I ever wrote in plain C++ to be published in any form.

Having built many throwaway prototypes by this time, but never quite finished anything, I was still looking for that piece of technology that could bridge the gap between my limited understanding of programming and that coveted finished game project I wanted to make. At this point, I’m almost six years into my career as a game developer and my title is Gameplay Designer.

It’s in 2011-2012 that I discover Unity. On my lunch break and on weekends, I played around with it, and it’s probably the fastest results I’ve ever had in any game engine. The GameObject/Component relationship was the most intuitive thing I had ever seen, and my limited programming experience was an almost perfect match for what Unity required me to know.
Unity became my first introduction to teaching as well, with some opportunities at game schools in Stockholm that came about because a school founder happened to be at the Starbreeze office one lunch break and saw Unity over my shoulder. “Hey, could you teach that to students?” All of two weeks into using it, my gut response was “yes,” before my rational brain could catch up. But it turned out I just needed to know more than the students, and I had a few more weekends to prepare before the course started. Teaching is something I’ve done off and on ever since—not just Unity—and something I love doing. Some of my students have gone on to have brilliant careers all across the globe, despite having the misfortune of getting me as their teacher at some point.

Since 2011, I’ve worked at four different companies using Unity professionally, and I have been both designer and programmer at different points in time, sometimes simultaneously. It’s probably the engine I’m most comfortable using, still to this day, after having been part of everything from gameplay through cross-platform deployment to hardware integration and tools development in it. You can refer to Unity as a “frontier engine,” meaning that it’s early to adopt new technologies and its structure lends itself very well to adaptation. You set it up to build a “player” for the new target platform, and you’re set. Today it’s more fragmented than it used to be, with multiple different solutions to the same problems, some of which are mutually exclusive. If you ask me if I think it’s the best engine, my answer would be no, but I’ll be revisiting its strengths and weaknesses in a different post.

The same person who pulled me in to teach Unity also introduced me to Unreal Engine 4 in the run-up to its launch. I was offered an opportunity to help out on some projects, and though I accepted, I didn’t end up doing much work.
It coincided with the birth of my first child (in 2013) and therefore didn’t work out as intended. I’ve still used Unreal Engine 4 quite a bit, including working on prototypes at a startup and teaching it to students.

It’s a huge leap forward compared to the UDK, maybe primarily in the form of Blueprint. Blueprint is the UE4 generation of Kismet and introduced the object form of Blueprints that you’re used to today. Rather than locking the visual scripting to a level, Blueprints can be objects with inheritance. They are C++ behind the scenes, and the engine can handle them easily and efficiently using all the performance tricks Unreal is known for. Funnily enough, if you came from the UDK, you can still find many of the Kismet helper classes, and various UnrealScript template shenanigans are still there in Blueprint and Unreal C++, wrapped into helper libraries. It’s clearly an engine with a lot of legacy, and the more of it you know before starting, the better.

Autodesk Stingray is an engine that was developed from the Swedish BitSquid engine after Autodesk purchased it and repurposed it for their own grand game engine schemes. BitSquid was a company founded by some of the developers who once made the Diesel engine, which was used at the long-since defunct Grin and the later Starbreeze-merged Overkill game studios. When I worked with it, Autodesk had discontinued the engine, but three studios were still using it and supporting it with internal engine teams. Those three were Arrowhead, Fatshark, and Toadman. I worked at Toadman, as Design Director.

As far as engines go, Stingray has some really interesting ideas. Two things struck me specifically. The first is that everything in the engine is treated essentially as a plugin, making it incredibly modular. Animation tool? Plugin. Scripting VM? Plugin. The idea of a lightweight engine with high extensibility is solid. I’m not sure it was ever used that much in practice, but the intention is good.
Another thing I bring with me from those days isn’t strictly about Stingray, but about a fantastic data management tool called Hercules that Toadman used. It allowed you to make bulk changes to data, say doubling the damage of all weapons with a single command, and was an amazing tool for a system designer. It decoupled the data from the game client in ways that still inspire me to this day. Sadly, since earlier this year (2025), Toadman is no longer around.

The jump between Unreal Engine 4 and Unreal Engine 5 is not huge in terms of what the engine is capable of, even if Epic certainly wants you to think so (Lumen and Nanite come to mind). But there is one big difference, and that’s the editor itself. The UE5 editor is much more extensible and powerful than its older sibling, and is both visually and functionally a complete overhaul. There’s also a distribution of Unreal Engine 5 called Unreal Editor for Fortnite that uses its own custom scripting language called Verse, which is said to eventually be merged into the bigger engine. But I simply have no experience with that side of things. My amateur level designing days are long since over.

Probably the biggest change between the UDK and UE5 is that the latter wants to be a more generic engine: something that can power any game you want to make. But in reality, the engine’s high-end nature means that it’s tricky to use for more lightweight projects on weaker hardware, and the legacy of Unreal Tournament still lives on in the engine’s core architecture and workflows. As with Unity, I don’t think it’s the best engine. But I’ll get into what I consider its strengths and weaknesses in a future post. I’ve spent years working with the UDK, UE4, and UE5, in large teams and small, but haven’t released any games with them thus far. Projects have been defunded, cancelled, or otherwise simply didn’t release for whatever reason.
Imagine that you release a new update for your game every week, and you’ve been doing so consistently since 2013. This is Star Stable Online—a technical marvel simply for the absolutely insane amounts of data it handles. Not to mention the constant strain on pipelines when they’re always in motion.

My biggest takeaway from working alongside this engine last year (2024) is its brilliant snapshot feature that allows you to save the game’s state at any moment and replay from that moment whenever you like, even shared with other developers. This approach saves a ton of time and provides good grounds for testing the tens of thousands of quests that the game has in store for you after its 14-year (and counting) life span. You may look at its graphics and think, “why don’t they build this in Unreal?”, but let’s just say that Unreal isn’t an engine built to handle such amounts of data. The visuals may improve, but porting it over would be a much larger undertaking than merely switching out the rendering.

Can’t really talk about it. It’s Stingray, but at Arrowhead, and it powers Helldivers 2. Like the engine’s Fatshark and Toadman variants, it has some features and pipelines that are unique to Arrowhead.

I hope I get to play around with even more engines than I already have. They all teach you something and expand your mind around how game development can be done. At the end of the day, it doesn’t matter which engine you use, and it’s not often that you can make that decision yourself anyway if you’re not footing the bill. Like an old colleague phrased it, “there’s no engine worse than the one you’re using right now.” Fortunately, there’s also no better engine for getting the work done.

QBASIC (around ’89 or ’90?)
Hex editing (early 90s)
Build Engine (early-mid 90s)
WorldCraft/Hammer (late 90s, early 00s)
DarkBASIC/DarkBASIC Pro (late 90s, early 00s)
RenderWare Studio (’05 or ’06)
SimBin Engine (’06): first professional work
UDK (’05-’07)
Starbreeze Engine (’07-’12)
DarkGDK/Nuclear Fusion (’09-’12)
Unity (’12-today)
Toadman Stingray (’17-’20)
UE4 (’14-’21, sporadically)
UE5 (’21-today)
Star Stable Engine (2024)
Arrowhead Stingray (2025-?)

0 views
Dizzy Zone 1 year ago

Enums in Go

I’ve seen many discussions about whether Go should add enum support to the language. I’m not going to bother arguing for or against, but instead show how to make do with what we have in the language now.

Enumerated types, or enums, represent finite sets of named values. They are usually introduced to signal that variables can only take one of the predefined values for the enum. For example, we could have an enum called Color with members Red, Green and Blue. Usually, the members are represented as integer values, starting from zero. In this case, Red would correspond to 0, Green to 1 and Blue to 2, with Red, Green and Blue being the names of the corresponding members. Enums help simplify the code as they are self-documenting and explicitly list all possible values for the given type. In many languages, enums will also produce compile errors if you try to assign an invalid value. However, since enums do not exist in Go, we do not have such guarantees.

Usually, one of the first steps of defining an enum in Go is to define a custom type for it. There are two commonly used underlying types for this purpose: string and int. I actively avoid defining my enums as strings, since using strings increases the likelihood of errors. You don’t really know if the enum members are defined in uppercase, lowercase, title case or something else entirely. Besides, there is a high chance of misspelling the strings, both in the definition and in subsequent use, so that one value silently becomes another. I also often use bitmasks, so that might influence my judgement, but I’ll talk about bitmasks in a separate post at some point. For these reasons I prefer an int declaration. Keep in mind that this is highly subjective and people will have their own preferences. Also, the int definition does not read as nicely when displayed, but we will fix that later.

In the colors example, I’m only using three colors, but what if we had 5, 10 or 20 of them? It would be quite tedious to assign values to each and every single one of them.
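To make the two declaration styles concrete, here is a minimal sketch (not the post's original snippet; the member names follow the colors example):

```go
package main

import "fmt"

// String-backed: readable when printed, but nothing stops a typo
// such as "Geren" from compiling just fine.
type ColorName string

const (
	RedName   ColorName = "Red"
	GreenName ColorName = "Green"
	BlueName  ColorName = "Blue"
)

// Int-backed: the style preferred in this post, with the values
// assigned explicitly for now.
type Color int

const (
	Red   Color = 0
	Green Color = 1
	Blue  Color = 2
)

func main() {
	// The string type prints its name; the int type prints a number.
	fmt.Println(RedName, Green)
}
```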
Luckily, we can simplify this by using the iota keyword that Go provides. Iota acts as syntactic sugar, automatically incrementing the value for each successive integer constant in a constant declaration. If we’d like to start at another number, we can add an offset to iota. You can also use a variety of expressions on iota, but I’d recommend against that except for the most trivial of cases, as it leads to code that is hard to read and comprehend. One common use case of such expressions which is still readable is defining bitmasks. For more on iota, please refer to the Go spec.

One thing to note is that you should be very careful when making changes to already established constant declarations with iota. It’s easy to cause headaches if you remove or change the order of members, as these could have already been saved to a database or stored in some other way. Once you ingest those, what was once blue might become red, so keep that in mind.

While such declarations might suffice as an enum in some circumstances, you will usually expect more from your enum. For starters, you’d like to be able to return the name of a member. Right now, printing a Color prints its integer value, but how would we print the name? How would I determine whether a given integer is a valid color or not? I’m also able to define a custom color by simply declaring a variable of type Color with an arbitrary value. What if I’d like to marshal my enum to its string representation when returning it via an API? Let’s see how we can address some of these concerns.

Since we’ve defined Color as a custom type, we can implement the stringer interface on it to get the member names. We can then print the name by calling the String method on any of the colors. There are many ways one could implement this method, but all of them have the same caveat: whenever I add a new color to my constant declaration, I will also need to modify the method. Should I forget to do so, I’ll have a bug on my hands. Luckily, we can leverage code generation: the stringer tool can help us.
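The iota declarations described above, including an offset start, a bitmask, and a hand-written String method (with the keep-it-in-sync caveat), can be sketched like this; the Priority and Permission types are illustrative additions:

```go
package main

import "fmt"

type Color int

const (
	Red   Color = iota // 0
	Green              // 1
	Blue               // 2
)

// Starting from another number: add an offset to iota.
type Priority int

const (
	Low    Priority = iota + 1 // 1
	Medium                     // 2
	High                       // 3
)

// A bitmask is one of the few readable uses of expressions on iota.
type Permission uint8

const (
	Readable   Permission = 1 << iota // 1
	Writable                          // 2
	Executable                        // 4
)

// String implements fmt.Stringer by hand. The caveat: every new
// color added to the constant block must also be added here.
func (c Color) String() string {
	switch c {
	case Red:
		return "Red"
	case Green:
		return "Green"
	case Blue:
		return "Blue"
	}
	return fmt.Sprintf("Color(%d)", int(c))
}

func main() {
	fmt.Println(Blue, High, Writable|Executable) // Blue 3 6
}
```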
It can generate the code required for our Color enum to implement the stringer interface. You’ll need to have the stringer tool installed. Afterwards, include a go:generate directive; I usually plop it right above my enum type declaration. If you run go generate, you’ll see a file appear with the stringer interface implemented, allowing you to access the names of the members.

If we use our enum in a struct and marshal it, it will be represented as an int. Sometimes, this behavior might suit you. For instance, you might be OK with the value being stored as an integer; however, if you’re exposing this information to the end user, it might make sense to display the color name instead. To achieve this, we can implement the MarshalText method for our enum. I’m specifically implementing MarshalText rather than MarshalJSON, as the standard library’s json package falls back to using MarshalText internally. This means that by implementing it we will get the color represented as a string in the marshalled form for JSON, XML and text representations and, depending on the implementation, perhaps other formats and libraries.

If we’d like to accept string colors as input, we’ll have to do a bit more work. First, we’ll need to be able to determine if a given string is a valid color for our enum or not, so let’s implement a parse function. Once again, we could implement this in many different ways, but they all have the downside that if we’re ever expanding our enum, we’ll have to go into the function and extend it to support our new members. There are tools that can generate this for us, and I’ll talk about them later. With this in place, we can implement the UnmarshalText method and unmarshal an input with colors as strings. If an invalid color is provided, the unmarshalling will result in an error.
Similarly, we could implement the valuer and scanner interfaces for database interactions. If you're working with quite a few enums and need the custom marshalling and stringer/valuer/scanner interface implementations, it can become quite tedious to go through all these steps for each of your enums. Everything that I've discussed so far can be generated with the go-enum library. With it, the enum definition becomes a bit different: if you run `go generate`, a file will be generated including all the custom marshalling, parsing and stringer/valuer/scanner implementations. This is a great tool if you work with multiple enums and have to do this often. Another alternative, which leverages generics and avoids generation, is the enum library. Both of these are valid options, and it is up to you to choose one that suits your needs. I will go over my preferences at the end of this blog post.

There's one caveat with these enums: one can still construct a new enum member by hand. There's nothing preventing me from defining an arbitrary value of the enum type and passing it to a function expecting a color. Firstly, I would like to say that there is no bulletproof way to protect against this, but there are some things you can do. We could define our enum as a struct in a separate package. You would obviously include ways to construct valid enums from outside the package, by either the ID or the name, along with the methods for serializing, stringifying and any other needs you have. However, this only provides an illusion of safety, since you can still: 1) reassign one exported color to another, as structs cannot be declared const, and 2) construct the zero value of the struct.

For 2) we could shift our enum by 1 and include an unknown value with the ID of 0. We'd still have to handle this unknown value in any code dealing with colors. Since structs cannot be declared const, we have to become inventive to cover ourselves from 1). We can define the colors as funcs: with this in place, you can no longer assign one color as another.
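The database side can be sketched like this, assuming colors are stored by name rather than by integer (the post doesn't specify which; the `colorNames` helper is an assumption):

```go
package main

import (
	"database/sql/driver"
	"fmt"
)

type Color int

const (
	Red Color = iota
	Green
	Blue
)

var colorNames = map[Color]string{Red: "Red", Green: "Green", Blue: "Blue"}

// Value implements driver.Valuer so the enum is stored as its name.
func (c Color) Value() (driver.Value, error) {
	name, ok := colorNames[c]
	if !ok {
		return nil, fmt.Errorf("invalid color %d", int(c))
	}
	return name, nil
}

// Scan implements sql.Scanner, converting a stored value back to a Color.
func (c *Color) Scan(src any) error {
	var s string
	switch v := src.(type) {
	case string:
		s = v
	case []byte:
		s = string(v)
	default:
		return fmt.Errorf("unsupported type %T", src)
	}
	for col, name := range colorNames {
		if name == s {
			*c = col
			return nil
		}
	}
	return fmt.Errorf("invalid color %q", s)
}

func main() {
	v, _ := Blue.Value()
	fmt.Println(v) // Blue

	var c Color
	_ = c.Scan("Green")
	fmt.Println(c == Green) // true
}
```

Storing names instead of raw integers also sidesteps the reordering hazard mentioned earlier: renumbering the constants no longer silently changes what persisted rows mean.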
An alternative approach would be to make the color type an unexported int and only export its members. To make this type even remotely useful, we could export and implement a Colors interface; you could then use the interface in your function signatures. In theory, you could still write a custom implementation of this interface and create a custom color that way, but I think that is highly unlikely to happen. However, I'm not a big fan of these approaches, as they seem a tad cumbersome while providing little in return.

With all this said, I'd like to point out a few things that have been working quite well for me in practice and in my general experience. This is just my opinion and it might not match the situation you're in. Always choose what works best for you! If you have any suggestions or alternatives, I'd be glad to hear them in the comments below.
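A rough sketch of that unexported-type-plus-interface approach (normally the type and the interface would live in their own package; the `Colors` interface shape and the unexported guard method are assumptions about the design the post describes):

```go
package main

import "fmt"

// color is unexported, so other packages cannot declare values of it.
type color int

const (
	red color = iota + 1 // shifted by 1 so the zero value stays "unknown"
	green
	blue
)

// Colors is the exported interface callers program against.
type Colors interface {
	fmt.Stringer
	isColor() // unexported method discourages outside implementations
}

func (c color) isColor() {}

func (c color) String() string {
	switch c {
	case red:
		return "red"
	case green:
		return "green"
	case blue:
		return "blue"
	}
	return "unknown"
}

// Only the members are exported, as interface values.
var (
	Red   Colors = red
	Green Colors = green
	Blue  Colors = blue
)

// Paint accepts only values satisfying Colors.
func Paint(c Colors) string {
	return "painting " + c.String()
}

func main() {
	fmt.Println(Paint(Green)) // painting green
}
```

Note the remaining gap the post concedes: the members are `var`s, so within reach of the package boundary they can still be reassigned, and a determined caller can implement `Colors` themselves.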

Pinaraf's website 12 years ago

Review – “Instant PostgreSQL Starter”

Thanks to Shaun M. Thomas, I had been offered a digital copy of the "Instant PostgreSQL Backup" book from Packt Publishing, and was provided with the "Instant PostgreSQL Starter" book to review. Considering my current work situation, doing a lot of PostgreSQL advocacy and basic teaching, I was interested in reviewing this one.

As the Instant collection motto says, it's short and fast. I somewhat disagree with the "focused" part for this one, but that's perfectly fine considering the aim of the book. Years ago, when I was a kid, I discovered databases with a tiny MySQL-oriented book. It taught you the basics: how to install, basic SQL queries, some rudimentary PHP integration. This book looks a bit like its PostgreSQL-based counterpart. It's a quick journey through installation, basic manipulation, and the (controversial) "Top 9 features you need to know about". And that's exactly the kind of book we need.

So, what's inside? I'd say what you need to kick-start with PostgreSQL. The installation part is straightforward: download, click, done. Now you can launch pgAdmin, create a user and a database, and you're done. Next time someone tells you PostgreSQL isn't easy to install, show them this book. The second part is a fast SQL discovery, covering a few PostgreSQL niceties. It's damn simple: Create, Read, Update, Delete. You won't learn about indexes, functions or advanced queries here. For someone discovering SQL, it's what needs to be known to just start. The last part, "Top 9 features you need to know about", is a bit harder to describe.
PostgreSQL is an RDBMS with batteries included, so choosing 9 features must have been really hard for the author, and nobody can be blamed for leaving out this or that feature you happen to like: there's too much to choose from. The author spends some time on pgcrypto, the RETURNING clause with serial, hstore, XML, even recursive queries. This is, from my point of view, the troublesome part of the book: mentioning all these features means introducing complicated SQL queries. I would never teach someone recursive queries before teaching them joins; it's like going from elementary school to university in forty pages. On the positive side, an open-minded and curious reader will get a great teaser and nice tracks to follow to increase their knowledge of PostgreSQL. Mentioning hstore is really cool; that's one of the PostgreSQL features one has to know about.

To sum up my view of this book: it's a nice book for beginners, especially considering the current NoSQL movement and people forgetting about SQL and databases. It's a bit sad we don't have more books like this one about PostgreSQL. I really hope Packt Publishing will try to build a complete collection, from introduction (this book) to really advanced needs (PostgreSQL High Performance comes to mind), through advanced SQL queries, administration tips and so on. They have a book about PostgreSQL Server Programming planned for next month, and I'm really looking forward to that one.
