flowtwo.io 1 week ago

Stuff I Dug in 2025

The end of the year provides an opportunity to look back on all the media and entertainment I enjoyed over the last 12 months. I like taking some time to reflect on what had an impact on me, and why I liked it. It's fun to do during the holidays when I generally have more free time to write. So without further ado, here's my favourite music, film, and TV from 2025...

Speyside is off of Bon Iver's 2025 album SABLE, fABLE. It's a relatively straightforward composition—just an acoustic guitar, some violin, and Vernon's soft vocals. But the lack of elements allows each one to stand out and add exactly what it needs to the sound. Each twang of the guitar rides along with Vernon's crooning in beautiful harmony. And the small breaks, the sonic negative space, give breathing room for the lyrics to sink in. A minimal masterpiece, in my opinion.

I've been a fan of Milky Chance since university. Their first big hit, Stolen Dance, was one of my favourite songs back in those days, which was like... 2013. Damn. In the (many) years since, this German duo has released several more albums and EPs. I've always found a few tracks on those releases I liked. Their sound has stayed consistent—spaced-out, electronic folk rock. It's funky and easy to get lost in. Their latest release, Trip Tape III, is a continuation of their Trip Tape series. As the name implies, they're mixtapes instead of a proper album. They contain covers, unreleased demos, and original songs all blended together into a perfect lazy-day-on-the-beach soundtrack. Or a summer road trip. Or playing pickleball with your uncle. Whatever it is, they make good vibes. I had Trip Tape III on repeat for months this year. I love Camouflage, Million Dollar Baby, and Naked and Alive—all standout tracks for me. So for this stellar mixtape, and for continuing to deliver these upbeat indie vibes for over a decade, Milky Chance is my artist of the year.

First off, this guy's name isn't Barry, it's Joshua Mainnie.
Secondly, I'm unsure whether he can swim or not. But what I am sure of is his ability to make incredible dance music. Barry's—err, Joshua's—first album, When Will We Land, was a launching point for his career (no pun intended). It received praise for the vibrant, "organic" sound superbly crafted by Mainnie. It's upbeat, unruly, and has plenty of variety.

Barry Can't Swim released his second album, Loner, this summer. It's reminiscent of his first album in all the right ways. Samples, beat patterns, and instruments are all layered into an evolving melody that blends seamlessly as the album plays out. It's all danceable but feels very raw and emotional at the same time, probably because of the heavy use of vocal samples. Loner opens with the insanity-inducing The Person You'd Like to Be, a sort of sonic ego trip that includes positive affirmations from robots and drawn-out chords that sound like sirens. But after this crazed start, Mainnie takes us on a ride to a daytime dance party. Kimpton is bouncy and bright, complete with horns, steel drums, and some sort of chanting chorus. Things start to mellow out near the end of the album—Like It's Part of the Dance is a favourite of mine.

I watched Past Lives while on vacation last February. I'd heard good things about it—it premiered in 2023 and received lots of praise, including Oscar nominations for Best Picture and Best Original Screenplay. It was one of those stories that leaves you with an odd nostalgic feeling afterwards, and not only because it's a story about childhood love. It's also the way the movie is structured: it takes place over three periods with 12 years in between each: 1999, 2011, and 2023. This provides a bird's-eye view into the main character's life at these different stages. You see how she grows up and how certain paths she takes have ripple effects years into the future. It all just made me think about how quickly we age, and how our life will only ever play out once.
I also appreciated how un-Hollywood the story was. It doesn't end with any grand gestures or dramatic rekindling of a childhood love. It ends very realistically: just a quiet goodbye between two friends and an acknowledgement of life's what-ifs. Life will take you in many directions, but you'll always carry your memories (or, past lives) within you.

Okay, so I first watched Blackberry back in 2023 when it came out in theatres. But I re-watched it earlier this year, so it still counts. Matt Johnson is a Canadian director best known for his television series Nirvanna the Band the Show. It's a hilarious mockumentary series that stars him and co-creator Jay McCarrol conspiring to get their band—named Nirvanna the Band—a gig at the Rivoli. It's one of my favourite TV shows of all time, and not just because of its hyper-local setting and comedy; it's a uniquely funny show.

Blackberry was Johnson's "breakout" film in the sense that it was his first with a multi-million dollar budget. It received critical acclaim and numerous awards at the Canadian Screen Awards. And rightly so, because it's a masterfully executed film. Johnson carefully interweaves his signature fast-paced comedy into a real story about the rise and fall of one of the landmark technologies of the 21st century: the Blackberry. It was dramatic, nerdy, and seriously funny at the same time.

just casually showing up for your movie premiere in sweatpants and a Jays T-shirt

In 2025, Johnson premiered his next film, Nirvanna the Band the Show the Movie—a spiritual successor to the TV show. I unfortunately haven't seen it yet; I'll have to wait for the theatrical release in early 2026. I really appreciate Johnson's love for his home and how he stays true to this in his work. He wants to change the idea that Canada is just a cheap place to film American movies and TV. It's also a place with its own stories: stories that deserve to be told.
I'm looking forward to seeing what Johnson will sink his teeth into next. I read he's directing an Anthony Bourdain biopic and a Dungeons & Dragons movie.

I didn't watch too many documentaries this year, to be honest. In any case, my selection for the most impactful documentary I watched is The Present. And fortunately for you, it's available in its entirety on YouTube. It's a short but beautiful film about Dimitri Poffé, a young man from France who was diagnosed with Huntington's disease in his 20s. The documentary follows Dimitri's bikepacking journey across Central and South America in an effort to raise awareness for the disease. As a novice bikepacker myself, I found the premise alone enough to hook me. But it turned out to be much more than just another YouTube bikepacking recap. Overlaid with an incredible monologue from Dimitri himself, The Present focuses on time, and specifically the time we have here on Earth. What we can do with it and what we're capable of. It was really moving and sad at times, but ultimately it delivers an important message that anyone could benefit from hearing.

Adventure becomes a way to feel truly alive. It becomes a way, even for a moment, to stop the ticking clock of life — Dimitri Poffé

I haven't finished Demon Slayer yet, but this has undoubtedly been the most entertaining TV I've watched all year. Normally I'm not a huge fan of anime, but I decided to give this show a try based on a recommendation from a friend. Demon Slayer is an adaptation of the Japanese manga series of the same name, published between 2016 and 2020. The anime is a few years behind, so it only concluded in July of this year. It's action-packed, it doesn't take itself too seriously, and the art direction is wildly creative. Demon Slayer takes place in a fantastical version of Japan full of demons and demon slayers, all of whom have a flair for the dramatic. If you're like me and haven't watched much anime, then the dialogue might throw you off a bit. It's very... explicit.
Every character states their intentions and actions directly, either out loud or as an internal monologue. It can sound a bit melodramatic at times. Overall, it's a really fun show to watch.

only downside is Zenitsu is the most annoying character on television

Seth Rogen's The Studio was a rollercoaster ride of a series. The concept is probably the easiest thing to get greenlit by a studio: Hollywood loves a show or film about itself. The cinematography stands out for me; the show is mostly composed of long, unbroken shots and dialogue-driven scenes. And it moves along at a breakneck pace—always tense and on the verge of collapse. This makes for good comedy, albeit with an elevated heart rate. I also loved the music in The Studio. The show uses an original score of mostly drums, with only small flourishes from other instruments at key moments. This percussion-heavy soundtrack complements the show's pace and emotionally charged dialogue so well.

Episode 2, The Oner, exemplifies all the best aspects of The Studio. It takes the extended-shot theme to its extreme by filming the whole episode as a single shot. Not only that, the episode is about a movie set where the crew is attempting to film a single-shot sequence. So it's all very meta and self-aware. It's completely unhinged and disastrous due to Rogen's character (the studio executive) trying to be helpful but accomplishing the opposite. It also establishes the kind of person he is for the rest of the series: idealistic, friendly, but lacking self-awareness. It's hilarious TV; give it a watch if you're looking for a laugh.

flowtwo.io 1 month ago

On 10 Years of Writing a Blog Nobody Reads

In November 2015, I started a blog on Blogger. My first post was a book review of The Martian by Andy Weir. Ten years and a couple of blog migrations later, I'm still writing. I wanted to share some thoughts and learnings I picked up throughout this time. Some of it is specific to writing a blog, but some is generally applicable to writing in any format.

goodnight sweet prince

One of the main reasons I maintain this blog is to become a better writer. I really appreciate when someone's writing feels effortless. Whether it's in a book, an article, or even a technical document—communicating effectively is a fine art. I'm not there yet, but I enjoy the process of improving.

My style has certainly improved since my early days of writing. Reading my old stuff is painful. I would use too many qualifiers and verbose phrases. It was a direct translation of the way I spoke, which it turns out is a bad strategy for how you should write. If your goal is to have other people read—and hopefully enjoy—your writing, you should make an effort to edit your thoughts. Here's a sample of the useless phrases I would add to the start or end of almost every sentence:

I believe... It feels like... It seems that... In my opinion...

This was my worst habit when I started. It's just fluff that makes it exhausting to read. It's redundant to say "I think" at any point in an opinion piece.

keep all that pondering to yourself buddy

Using this "careful" language just softens your ideas to the point of being inarguable. If you start a sentence with "I feel..." then no one can dispute anything that follows, since it's just your feeling. This is boring to read. Writing a blog, or anything really, is your contribution to public discourse. Sure, this blog only averages 10 page views a week (9 are bots and 1 is me), but I'm still throwing my ideas out there into the digital ether. If you're publishing something on the internet, you might as well stand tall behind your words and wait for someone to call bullshit.

Using multiple adjectives is another bad habit I struggled with in the past. Phrases like:

...interesting and thought-provoking... ...broad, wide-ranging... ...detailed and well-written...

These are unnecessarily descriptive and, more often than not, redundant. Just use one really good punctilious adjective instead. Open a thesaurus if you need to.

My goal now is to use fewer words to convey an idea. Everyone's interpretation of words is different, so piling on more of them will just muddle your ideas. To use a metaphor from electronic communication: there's so much noise in the channel that modulating your signal doesn't provide any extra information.

The writing process should be highly iterative—many drafts are needed before you arrive at something you're happy with. Taking time between drafts can help too, so you come back to it with a different perspective on what you wrote. If we're talking about a blog, there's really no strict timeline for getting a piece of content out, so when you choose to publish is up to you. Even after publishing, there's nothing stopping you from updating the content afterwards.

You should write down ideas when you have them. Literally—I wrote the genesis of this paragraph while in bed at 5am in January. You never know when inspiration will strike, so I find it best to get the thought down quickly and then expand on it later. It really helps to make writing as accessible to you as possible. For example, I use Obsidian for all my drafts now. It has cross-device support with cloud syncing, so "writing from anywhere" (mostly my phone) is easy now.

I can now publish my smart toaster review directly from my smart toaster

There's a lot of talk about the value of "manual" writing in the age of generative AI. GenAI, specifically Large Language Models, can be thought of as calculators for writing; they can generate coherent written ideas instantly from any input. So just like how nobody does long division by hand anymore, maybe people won't do much writing by hand one day.

The introduction of GenAI has pushed the supply of written content to infinity, essentially. So from an economics standpoint, without any resource scarcity, the value of written words has been reduced to zero. But is there still value in human-produced writing? Subjectively, yes. Objectively? I'm not sure.

I think there's a lot of personal value in writing, though. Book reviews, for example, are essential for gaining a better understanding of what you read. Writing one helps crystallize the knowledge in some way and integrates it into your mental map of the world. The reviews I post vary in content—sometimes it's a critique, or a summary, or an extrapolation of a concept from the book that I'll do additional research on. Either way, this process helps me remember something about the book long-term. I think of it like breathing, but for ideas. We do so much reading all day—there should be a natural balance with producing words too. Inhale, exhale, inhale, exhale...

And I'm still not a great writer by any means. There are a lot of ways to improve, which is kind of motivating and excites me to keep writing. I often write "too much" and struggle to really condense my thoughts into a sharpened essay. Most of my posts are 2000+ words... nowadays I'm trying to restrict myself to 1000 words. The limit forces me to really think about the core idea I want to share.

*checks word count*

Thanks for reading!

flowtwo.io 3 months ago

There is No Antimemetics Division

"There is an invisible monster which follows me around and likes to eat my memories," Marion explains, patiently. "SCP-4987. Don't look it up, it's not there. I've learned to manage with it. It's like a demanding pet. I produce tasty memories on purpose so it doesn't eat something important, like my passwords or how to make coffee." — qntm, There is No Antimemetics Division, Ch. 7, para. 72

There is No Antimemetics Division was written by Sam Hughes, better known as "qntm" online. The book was originally published online as contributions to the SCP Foundation Wiki. The SCP wiki is an open community of writers who have been building a shared fictional universe around a secret organization that protects humanity from paranormal anomalies. It's like an open-source Men in Black. Hughes' contributions to the wiki, which eventually became this standalone novel, take aim at the most insidious anomaly of all: the things you can't remember.

before reading, please forget that neuralyzers ever existed

The writing is not the book's strongest asset. The characters feel flat and inconsequential—merely there so someone can observe the strange phenomena taking place. The dialogue often veers into awkward, almost parodic exchanges. At times it reads less like a novel and more like a script for a dramatized Wikipedia entry.

Antimemetics sits at the intersection of horror and science fiction—it's definitely trying to be frightening. But effective horror needs rhythm and restraint; it needs quiet before the scream. Only once the reader is immersed can they be truly impacted by the weight of a scary event. In Antimemetics, everything is cranked to eleven from the start. Each chapter feels like a climax, escalating from "something's off" to a full-blown catastrophe within a few pages. The result isn't suspense—it's exhaustion. When every moment is chaos, none of it feels real.

That said, qntm deserves credit for some creative writing and presentation.
For example, the use of redacted passages and corrupted text evokes the sense of memory loss associated with these antimemetic phenomena:

██ 2010 ██ ███ discovered ████ the antimemetic camouflage ██████, also characterised as "decay" or "corrosion", was spreading through paper records of ███-████. As of ████, more ████ 60% of █████ documents are ██████████, even with █ strong mnestic dose. The effect is even ████████ ████ ████ SCP entry itself, despite ███ ███ shielding and redundancy in ████ system. — qntm, Ch. 13, para. 16

These stylistic choices, along with shifting narrative formats—SCP Foundation logs, secret reports, and third-person storytelling—give the book a textured, documentary feel. It's inventive, if not always cohesive.

The story's best trait, by far, is its core concept of antimemes. Antimemes are ideas or entities that resist perception and memory; they're one of the more unsettling concepts in recent sci-fi. It's psychologically spooky... having monsters that can roam the Earth causing havoc and terror, while no one remembers they exist. The whole premise of the book is anchored around memory and how it's both weaponized, and defended against, by the members of the Foundation. Hughes builds a world where entire teams of scientists and engineers wage psychological warfare against forgetfulness itself. They use a variety of techniques to combat these entities: memory-blocking drugs ("amnestics"), memory-preserving drugs ("mnestics"), and cognitive isolation rooms, to name a few. The way they partition memories reminded me of Severance, and their efforts to hide these anomalies using memory-erasing technology recall the Men in Black movies. But the Antimemetics Division goes further—asserting that memory, more than truth or courage, is the real weapon against oblivion.
SCP-9431: the weird goat room

After finishing the book, I found out via qntm's blog that he's signed a publishing deal to re-release Antimemetics as a "fully overhauled, thoroughly rewritten version", slated for release by the end of 2025. Frankly, I'd wait for this new version to be available. The overall concept is fascinating, but the lacklustre prose and choppy storytelling often get in the way. A cleaner, more polished rewrite could turn this into a genuinely poignant and memorable story—assuming, of course, you can remember it.

flowtwo.io 4 months ago

Automating Quordle With an LLM

At my current company, there's been a big push to use AI to automate everything we do. Anything that requires human intervention, isn't super critical, and happens regularly is considered a good candidate to be automated with agentic AI.

For the last 3 years, I've started most of my work days by playing the daily Quordle — a Wordle variant that requires you to solve 4 words concurrently in 9 guesses or fewer. I find it challenging enough (compared to Wordle) that I don't win every time. It's also a good way to get the brain juices flowing. But with the rise in AI tooling replacing every other aspect of my job, I won't need to use my brain much anymore. So I decided to automate this part of my day too. If you don't care about the details, you can check out the final result here. I'd suggest trying to play today's Quordle first before seeing how the AI did!

I started out with a basic model, because I honestly expected it to excel at this sort of game. I've become so accustomed to LLMs exceeding my expectations when I try something new. To my surprise, it did not understand the assignment. It failed to solve the Quordle every time, badly. Here's an example of the prompt I was using:

For one thing, it guessed "apple" way too often. You wouldn't think the example response would be that influential. On a few of the runs, it would inexplicably stick to a fruit theme too. LEMON, MELON, APPLE again... etc.

So I tried just throwing more AI at it, and switched to a newer model instead. It improved slightly—it varied the words more.

GPT 4... now with non-fruit related words

But it still didn't get close to solving it whatsoever. I thought it could be the prompt I was using. I wasn't actually explaining the rules of Quordle at all. It's possible there isn't much Quordle information in the model's training data. We all know LLMs are prone to being very confident and very wrong. I revised the prompt to the following:

But alas, things didn't improve much.
Turns out it wasn't because the model didn't know the rules.

Next, I took a look at how the words in the prompt were being tokenized. I have a basic understanding of tokens, and I guessed that the individual letters weren't being parsed and correlated correctly. I used OpenAI's tokenizer tool to see how the game state looked to the AI. That MUST be why! The LLM can't "see" the individual letters of the previous guesses, and by extension it can't correlate them with the corresponding board results. That information is crucial for a letter-centric game like this. I modified the guess word strings by splitting them into individual letters with spaces in between; this was the result:

That looked better. Now the LLM would ingest a different token for each character in the 5-letter word. I modified the serialization code to split the guess words up in the same way. But as you might've guessed from the length of this blog post, that STILL didn't improve things...

there's no participation medals here chatgippity...

Now I was starting to wonder if LLMs just aren't capable of doing the "word math" that is necessary for a game like Quordle. To test this theory, I tried tasking an LLM with a simpler game first: Wordle. Instead of modifying my program, I ran the test using the ChatGPT web UI. The result was very promising. The LLM performed WAY better when it was able to reason through the problem first. This is the thorough response it gave me:

ChatGPT Source

The key difference here is that my original prompt instructed the LLM to return a single 5-letter word only, but the prompt I gave ChatGPT didn't specify that. I designed the original prompt that way so I could parse the response programmatically. This test showed the LLM needs to work through an answer via output tokens.
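The letter-splitting step described above is simple to sketch. This is my own minimal version, not the post's actual code (the helper name is mine):

```python
def spell_out(word: str) -> str:
    """Split a guess word into space-separated letters so the tokenizer
    emits roughly one token per character, e.g. "crane" -> "C R A N E"."""
    return " ".join(word.upper())

print(spell_out("crane"))  # -> C R A N E
```

With each letter isolated like this, the model at least receives the guess as distinct per-character tokens rather than one opaque word token.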
This result led me to change the prompt in my program to instruct the LLM to reason about the answer before ending its output with a specific format:

This allows the LLM to reason as much as it wants while still letting me parse the final answer using a simple string search afterwards. Lo and behold, it finally started getting some words right:

Unfortunately, it became super slow due to the drastic increase in output tokens. After seeing the improvement from allowing the LLM to reason, I upgraded the model again to one of OpenAI's "reasoning" models. This change resulted in my program's first official Quordle win! However, it wasn't very reproducible—I think it got sort of lucky that first time.

I also found it was inconsistent in returning the format I requested. I was getting frequent errors when parsing the chat completions. I switched to using structured JSON outputs to eliminate those. This feature is only available on the newer OpenAI models; it allows you to specify a JSON schema that the chat completion is guaranteed to adhere to. After switching to structured outputs, the parsing errors were completely eliminated.

It was still taking a looooong time to get responses from the model, especially on later guesses. I had a request timeout of 180 seconds, but it was exceeding that regularly. The fact that the later guesses took longer made me think the model would benefit from reading all of its previous responses and reasoning. I know it's common practice to do this, but I'd figured that since the whole game state could be encoded in a single message, there wasn't any need to pass the previous messages. Once I started sending the message history to the model on each subsequent guess, it started answering quicker. It could now use its previous reasoning to come to a conclusion faster.

I also changed the game state representation one more time.
I encoded both the complete word AND the individual letters with their corresponding result in a more condensed format:

These two changes really improved the success rate of the program.

Some things I learned as a result of this project:

- Chat-based LLMs don't actually "think" behind the scenes. The way they mimic thinking is by producing output tokens. Although the newer class of reasoning models do this implicitly and return summaries of their own outputs.
- Similarly, an LLM isn't really reading the words, punctuation, and formatting you're sending it—the way a human would. All it's consuming is a sequence of numbers. The way information is encoded and then tokenized can be critical for certain use cases.
- Sending previous chat messages helps LLMs when performing multi-step tasks that require reasoning and deduction. The context speeds up their responses.

Early on, I really thought I'd discovered a task that AI was actually not good at. But after a bit of "prompt engineering", I managed to build a reliable and successful automation for this complex word puzzle game. Check it out here, and thanks for reading!
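The post doesn't show the final condensed encoding, but it could plausibly look something like this sketch: the whole word on one side, and each letter paired with its per-board result on the other. The result codes (g = green, y = yellow, x = gray) and function names are my assumptions, not the post's actual format:

```python
def encode_guess(word: str, results: list[str]) -> str:
    """Encode one guess as the complete word plus space-separated
    letter:result pairs, so the model sees both the whole word and
    each letter's feedback as distinct tokens."""
    letters = " ".join(f"{letter.upper()}:{result}" for letter, result in zip(word, results))
    return f"{word.upper()} -> {letters}"

def encode_board(guesses: list[tuple[str, list[str]]]) -> str:
    """Serialize the full guess history for one of the four boards."""
    return "\n".join(encode_guess(word, results) for word, results in guesses)

print(encode_guess("crane", ["g", "x", "y", "x", "x"]))
# -> CRANE -> C:g R:x A:y N:x E:x
```

An encoding along these lines gives the model the complete word for vocabulary recall while still exposing each letter and its result as separately tokenized units.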

flowtwo.io 4 months ago

Broken Money

In some sense, the circularity of the financial system is almost poetic; it represents how dependent we all are on one another. However, it’s also very fragile. Everything is a claim of a claim of a claim, reliant on perpetual motion and continual growth to not collapse. — Lyn Alden, Broken Money, Ch. 23, para. 24

When I started paying more attention to Bitcoin, I felt a desire to develop a better understanding of money in general. It seemed necessary if I wanted to really "get" Bitcoin and why it was so important. I looked for a book that could explain how money really worked, in an accessible format. I couldn't find anything at the time. That was around 2019, before Broken Money was written.

Broken Money was published by Lyn Alden in 2023. It's a really approachable text that describes the history of money, different forms of money, and the underlying qualities that make money useful.

In his article “On the Origin of Money,” Menger described that an ideal money transports value across both space and time, meaning that it can be transported across distances efficiently or saved for spending in the future. — Alden, Ch. 8, para. 20

Money is a system that efficiently transports value across space and time. I'd add "people" as a third dimension of transport. I think that's a good definition to start with.

There's no way to argue that the author isn't biased to some extent. She's both personally and professionally invested in the success of Bitcoin, and therefore is going to make the problems with the current financial system as pronounced as possible. With that said, I don't think she's making this stuff up. Most of the explanations and theories presented were believable to me, and the evidence is hard to ignore. But we have to accept that macroeconomics is incredibly complex and the best we can do is have theories.
Modern Monetary Theory, Austrian economics, the Chicago School of Economics, girl math... these are all popular schools of thought that attempt to explain how money and the economy work. But the economy is a complex, dynamic system of forces, and no one theory can perfectly explain it.

Alden spends a good deal of time in the book writing about the history of money and how we arrived at our present-day financial framework. Bitcoin actually isn't mentioned until chapter 20. This dissection of money really highlighted many of the flaws and limitations in our global monetary system. I'm going to focus on the parts I found most interesting and, frankly, concerning.

Today, every fiat currency on Earth is inflationary. This just means the value of a "dollar" (in the general sense) decreases over time. It's highly debatable whether this is good for a society, and who it's good for. Alden makes the case that inflation is counter-intuitive to how prices should work—but it's a necessary evil that the government enforces so our highly leveraged financial system doesn't collapse.

A 2% inflation target means that prices on average will double every 35 years. This is interesting, because ongoing productivity gains should make prices lower over time, not higher. Central bankers do everything in their power to make sure prices keep going up. — Alden, Ch. 25, para. 69

The constant change in the "price" of a dollar makes it hard to make long-term financial plans. Prices are the only mechanism for communicating information about value, so if these prices change over time in unpredictable ways, then we can't properly reason about long-term saving and spending decisions. In general, inflationary money rewards debtors (people who owe money) and incentivizes spending. At first that sounds fine, since the most financially vulnerable people in society are usually those in debt.
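Alden's 35-year figure above is just the standard compound-growth doubling time, which is easy to verify:

```python
import math

def doubling_time(rate: float) -> float:
    """Years for prices to double under constant annual inflation:
    solve (1 + rate) ** years == 2 for years."""
    return math.log(2) / math.log(1 + rate)

print(round(doubling_time(0.02), 1))  # -> 35.0
```

So at the common 2% target, prices double roughly every 35 years, exactly as the quote says.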
But the total amount of debt owed by the lowest earners in society doesn't even scratch the surface compared to the debt owed by the largest corporations, and even the government itself. So really, inflation rewards those at the top of the economy, it debases people's savings, and it incentivizes consumption and spending. It's a roller-coaster we have no choice but to ride.

The breakdown of the modern banking system was eye-opening for me. There's a distinction between the base money supply, which is all the money that actually exists, and the broad money supply, which is the total amount of dollars in circulation in the economy. Maybe you are as surprised as I was to find out these aren't the same thing. In essence, base money is all the dollars that have been created by the government's central bank, either by printing money or by issuing treasury reserves. Broad money is what you get if you add up every individual and corporate bank account balance in the country.

For both base money and broad money, most countries currently work the same way as the United States. A country’s central bank manages the base money of the system, and the commercial banking system operates the larger amount of broad money that represents an indirect and fractionally reserved claim to this base money. — Alden, Ch. 24, para. 28

What I took from this is that every dollar you see in your bank account does not represent a whole "dollar loan" that you'd be able to go claim anywhere. It's a fraction of a fraction of a claim on a real dollar somewhere in a huge system of hierarchical ledgers.

Money lent from one institution can be deposited at another institution and immediately (and fractionally) lent from there, resulting in the double-counting, triple-counting, quadruple-counting, and so forth, of deposits relative to base money. At that point, people have far more claims for gold than the amount of gold that really exists in the system, and so in some sense, their wealth is illusory.
— Alden, Ch. 13, para. 21

Although this is exactly how fractional reserve banking is designed to work, it still makes me feel uneasy. Everyone is just loaning assets they don't own, buying and selling these loans, and in general creating money out of thin air based on false promises. It's a shaky foundation that our entire society depends on.

The biggest flaw I see with modern economic systems is how much power is centralized—a small group of individuals makes all the decisions on how much money to print, what the cost of borrowing should be, and other monetary policies that influence millions of people. Although this is mainly a consequence of democratically elected leadership, the fact that humans make these macroeconomic decisions on behalf of everyone seems fallible at best, and downright corruptible at worst. It only takes one unethical or despotic leader to destroy a national currency:

To a less extreme extent — as I describe later in this book — this is sadly what happens throughout many developing countries today: people constantly save in their local fiat currency that, every generation or so, gets dramatically debased, with their savings being siphoned off to the rulers and wealthy class. — Alden, Ch. 8, para. 83

Sadly, it's seen time and time again in developing countries. And even in developed countries, economic policy tends to favour those who already have money and, by extension, political power. That means big corporations and their wealthy owners.

Over a 2-year period from the start of 2020 to the start of 2022, the broad money supply increased by approximately 40%. Printing money in this way devalued savers, bondholders, and in general people who didn’t receive much aid, and rewarded debtors and those who received large amounts of aid (keeping in mind that the biggest recipients of aid were corporations and business owners) — Alden, Ch. 27, para.
18

This sort of hair-trigger, reactionary decision-making is kind of unavoidable with the system of government we've devised. Democracy works in extremes and pushes those at the top to make rash decisions to appease voters and to maintain the appearance of leadership by making change for the sake of change. In essence, what I'm saying is that human decision-making is too flawed and influenced by emotion to be the way we set these policies.

People being at the centre of national fiscal policy is bad enough, but in the case of the United States, it's even worse. Because the U.S. dollar is the world's reserve currency, the decisions made by members of the Federal Reserve and Treasury Department affect the entire world. The buying power of every other currency is measured relative to USD, so if the U.S. government decided to print a ton of money and give it to themselves, they would effectively be stealing from the rest of the world. This seems like an unfair advantage for one country to have. I know life isn't fair, but I believe we, as a global society, could come to a consensus on a way to transact across borders that doesn't depend on any specific country's economy.

The most shocking part about this system is that it's not actually beneficial for America long-term!

it artificially increases the purchasing power of the U.S. dollar. The extra monetary premium reduces the United States’ export competitiveness and gradually hollows out the United States’ industrial base. To supply the world with the dollars it needs, the United States runs a persistent trade deficit. The very power granted to the reserve currency issuer is also what, over the course of decades, begins to poison it and render it unfit to maintain its status. — Alden, Ch. 21, para. 8

This is certainly debatable, but it makes sense intuitively.
If one country is allowed to issue currency which is globally accepted, and it's the only country with this ability, then its currency will carry an extra monetary premium above all others. This "built-in" economic premium granted to the American people allows them, collectively as a society, to rest on their laurels and not have to work as hard. In other words, America has the option to "buy instead of build" because it is so wealthy. This is the fundamental reason for the trade deficit it has with almost every other country. Learning about this was highly relevant in 2025, in the midst of the trade war the current U.S. President has launched. You could view the tariffs he's introduced as a way to neutralize this monetary premium and force America's stagnant economy to start building and manufacturing in a way it hasn't needed to since the Bretton Woods system was established.

After a lengthy explanation of the history of money, and then several chapters bashing the current monetary system, Alden finally introduces Bitcoin to the reader. I won't go into much detail here, as there are plenty of better resources than me that explain Bitcoin's core concepts, if you're interested. I'd also recommend reading this book. It explains Bitcoin really well. I'll briefly summarize how Bitcoin attempts to solve the problems I discussed above.

Bitcoin is a deflationary currency. It has a fixed supply of 21 million total coins, which means its purchasing power will trend upwards over time. In the best-case scenario, this means everyone continuously gets richer as we all equally benefit from improvements in production efficiency and technological innovation. In the worst case, it means society comes to a halt as people delay purchases indefinitely, waiting for their savings to be worth more. Either way, I believe a globally recognized, alternative currency model would be a healthy counter-balance to our existing fiat currency systems.
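As an aside, the 21 million cap falls directly out of Bitcoin's halving schedule: the block subsidy starts at 50₿ and halves every 210,000 blocks. A few lines of Python (a sketch built from the protocol's published constants, not production code) show where the limit comes from:

```python
# The block subsidy starts at 50 BTC and halves every 210,000 blocks.
# Summing every era's rewards (in satoshis, with integer halving as the
# protocol actually does it) converges on the famous ~21 million cap.
SATOSHI = 100_000_000            # satoshis per BTC
BLOCKS_PER_ERA = 210_000

subsidy = 50 * SATOSHI           # initial block reward, in satoshis
total = 0
while subsidy > 0:
    total += BLOCKS_PER_ERA * subsidy
    subsidy //= 2                # the halving

print(f"{total / SATOSHI:.4f} BTC")  # → 20999999.9769 BTC
```

The sum lands just under 21 million because the integer division eventually rounds the subsidy down to zero.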
At its core, Bitcoin is a bearer asset. Ownership of Bitcoin is instantly verifiable via the blockchain ledger. Money in its physical form is similar, in that it's a bearer asset. But the dollars in your bank account don't represent ownership at all. They're a promise by your bank to give you that amount of dollars if you ask for it. This promise can't always be fulfilled. Bitcoin is unique in that it's purely digital, yet it has the same qualities as physical dollars.

Finally, Bitcoin—as a monetary system—is completely decentralized. No single entity or government has any control over its rules. And its rules are set algorithmically and, in theory, are predictable for the rest of time. Nothing can change about Bitcoin unless the change is accepted by a majority of participants in the system . Couple that with the fact that Bitcoin's value is directly tied to the network's size and its popularity as an accepted form of currency, and you get an incentive structure designed to ensure the network remains fair and accessible to everyone. Otherwise, no one will want to use it. Is it perfect? No...but it's fairer than how monetary policy is defined today.

Broken Money is a sobering look at the state of money today. It traces the origins of money throughout human history—from the rai stones of Yap island to the post-COVID global inflation surge of the 2020s. It was well researched and well written. I don't think Bitcoin is going to overtake the fiat currency systems of the world. But I believe it's going to be around for a long time, acting as a hedge against governments' centralized control of money. In the worst case, it will act as a store of value akin to digital gold. In the best case, we will continue to innovate and build technology on top of Bitcoin that expands its utility in both familiar and novel ways.
In a recent post on Nostr , Alden makes the case that Bitcoin is something like an open-source, decentralized Fedwire , the settlement system that underpins the entire U.S. banking industry. This feels like an apt comparison to me—mainly because the Bitcoin network can support roughly the same transaction throughput as Fedwire. Maybe one day Bitcoin will become the global settlement system for an entirely new class of banks and financial service providers.

In my view, open-source decentralized money that empowers individuals, that is permissionless to use, and that allows for a more borderless flow of value, is both powerful and ethical. The concept presents an improvement to the current financial system in many ways and provides a check on excessive power, which makes it worth exploring and supporting. — Alden, Ch. 41, para. 79

flowtwo.io 9 months ago

Is Solo Bitcoin Mining a Good Investment?

I recently saw a headline about a "solo miner" successfully mining a Bitcoin block and receiving the coveted 3.125₿ mining reward. Actually, there have been a few of these stories in the past couple of months. It always generates a bit of buzz in the Bitcoin community; it's proof that the blockchain is truly decentralized and anyone can participate in it. All you need is an internet connection and a computer. It's also somewhat analogous to seeing a headline about someone winning the lottery. But does seeing a story like that ever entice you to go buy a lottery ticket? For most people, the answer is no, because it's common knowledge that the lottery is a bad investment and the odds are stacked against you. But this made me wonder: is that the case for Bitcoin mining too? And if so, what variables would turn it from a bad investment into a good one?

The reality of Bitcoin mining today is that it's become extremely centralized . Just 4 companies provide over 75% of the current global hashrate. And one company, Foundry USA, owns about 1/3 of the total hashrate. This is not ideal—a distributed network is fundamental for ensuring transactions in the ledger are authentic, censorship resistant, and correct. Too much centralization of mining makes the network susceptible to a 51% attack . Therefore, it's important that Bitcoin mining continues to be accessible to the general public. And since the only real motivation for participating is financial, it's crucial for the investment to be profitable, somehow.

How does one determine the profitability of mining bitcoin? Mining the next block in the blockchain involves finding a hash below a specific target value, so it's probabilistic in nature. A simple approximation for a probabilistic outcome is to calculate the expected value (EV) of participating in the network, and then compare that with the cost of doing so. This is straightforward to calculate with a few assumptions, which I will explain below.
I'm going to start by calculating the EV of running a single consumer-grade bitcoin miner. This can be considered the "base case" that we can extrapolate from afterwards. There are several companies producing these mining units; I'm going to choose one popular and somewhat recent model: the Bitaxe Gamma 600 Series . Here are the specifications we need: Source 1 Source 2

I'm based in Canada, so I'm going to use the average cost of electricity in Canada for this calculation. According to energyhub.org , the national average residential cost of power is $0.192 CAD/kWh. I'm also going to assume you run the miner 24 hours a day for 3 years straight. This provides the time frame to amortize the purchase cost of the miner. It also makes the math easier because we don't need to consider the next bitcoin halving event in 2028. Putting it all together, the cost of running a single Bitaxe miner per month is calculated as follows: So purchasing and running a Bitaxe Gamma 600 Series will cost you roughly $8.46 a month.

Now let's calculate the amount of money you can expect to make. Let's start with the current global hashrate for bitcoin mining. According to CoinWarz, the current hashrate is sitting just below 1 ZettaHash per second (ZH/s). It's not often I get to use a unit prefix like zetta, so that's exciting. Zetta is a trillion billion. The hashrate fluctuates quite a bit and will likely keep increasing in the future, but for this calculation I will assume it's 1 ZH/s.

The current mining reward, until 2028, is 3.125₿. At current prices, this equates to $367,900 CAD for every successfully mined block. The Bitcoin network is designed to produce a block every 10 minutes. It will adjust the difficulty level for mining blocks based on the recently seen hashrate, algorithmically, to ensure the network maintains a steady production rate of blocks. Therefore, we can expect roughly 4320 blocks to be mined every month.
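Before moving on to the expected value, the cost side described above can be sketched in a few lines of Python, using the figures from the post (17 W draw, $220 CAD purchase price, $0.192 CAD/kWh electricity, 3 years of 24/7 operation):

```python
# Monthly cost of one Bitaxe Gamma 600 Series running 24/7 in Canada.
POWER_WATTS = 17                  # Bitaxe power draw
HARDWARE_COST_CAD = 220.0         # purchase price
ELECTRICITY_CAD_PER_KWH = 0.192   # Canadian residential average
AMORTIZATION_MONTHS = 36          # 3 years of continuous operation
HOURS_PER_MONTH = 24 * 30

energy_kwh = POWER_WATTS / 1000 * HOURS_PER_MONTH    # ~12.24 kWh/month
electricity = energy_kwh * ELECTRICITY_CAD_PER_KWH   # ~$2.35/month
hardware = HARDWARE_COST_CAD / AMORTIZATION_MONTHS   # ~$6.11/month
monthly_cost = electricity + hardware

print(f"${monthly_cost:.2f}/month")                  # → $8.46/month
```

Note that most of the monthly cost is the amortized hardware, not the electricity, which matters later in the analysis.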
The expected value calculation is simply the chance you will successfully mine a block multiplied by the value of that block's reward. We can approximate the probability of that happening by dividing the hashrate of our Bitaxe by the total hashrate in the network. So, according to the math, you can expect to make $1.91 a month, while spending $8.46 a month, mining bitcoin with a single Bitaxe Gamma 600 Series in Canada. This equates to a return on investment of -77%.

Now, just for fun, let's compare this to buying a lottery ticket. Fortunately someone on GitHub already did the necessary calculations for me (thank you keitchchhh !). This analysis is specifically for the lotteries available in Ontario, Canada, where I live. The expected values here are based on a single $3 play and depend on the jackpot value. Even at the low end, $1.40, your expected return on investment is only -53%, so it's already a better investment than solo bitcoin mining. So in conclusion, buying a single consumer-grade Bitcoin miner is probably not going to make you rich. You'd be better off playing the LottoMAX.

But this raises the question: what do you need to do to make mining profitable? I asked an LLM to make a calculator based on all these variables so you can play around with the numbers and see what sort of changes can turn the ROI positive.

Next, I made some graphs showing how all these different variables affect the overall ROI. The data visualization code is available on my GitHub here . For all these graphs, I kept all the other variables the same as our "single-Bitaxe-miner-running-for-3-years-straight-with-average-energy-prices-in-Canada" example above. I just varied one input at a time to see the effect on the profitability. First, here's the effect of the global hashrate.
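As a sketch, here is the EV and ROI arithmetic in Python, along with the lottery comparison and a derived figure: the network hashrate at which this same miner would break even (all inputs taken from the assumptions above):

```python
# Expected value of one Bitaxe Gamma (1.2 TH/s) on a ~1 ZH/s network.
MINER_HASHRATE = 1.2e12          # hashes/second
NETWORK_HASHRATE = 1.0e21        # ~1 ZH/s
BLOCKS_PER_MONTH = 4320          # one block every 10 minutes
BLOCK_REWARD_CAD = 367_900       # 3.125 BTC at current prices
MONTHLY_COST_CAD = 8.46          # from the cost calculation above

win_probability = MINER_HASHRATE / NETWORK_HASHRATE
monthly_ev = win_probability * BLOCKS_PER_MONTH * BLOCK_REWARD_CAD
mining_roi = (monthly_ev - MONTHLY_COST_CAD) / MONTHLY_COST_CAD
print(f"EV ${monthly_ev:.2f}/month, ROI {mining_roi:.0%}")  # → EV $1.91/month, ROI -77%

# The LottoMAX comparison: $1.40 expected value on a $3 play.
lottery_roi = (1.40 - 3.00) / 3.00
print(f"Lottery ROI {lottery_roi:.0%}")                     # → Lottery ROI -53%

# Solving EV == cost for the network hashrate gives the breakeven point:
breakeven = MINER_HASHRATE * BLOCKS_PER_MONTH * BLOCK_REWARD_CAD / MONTHLY_COST_CAD
print(f"Breakeven at {breakeven / 1e18:.0f} EH/s")          # → Breakeven at 225 EH/s
```

The breakeven line simply rearranges the EV formula; it's the same number the hashrate graph below bottoms out at.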
At the current levels of ~1000 EH/s, the profitability of your single Bitaxe is basically bottomed out. But ROI climbs steeply as the total hashrate decreases, reaching profitability around the 225 EH/s mark.

Next, let's look at how the cost of electricity affects the ROI. It doesn't affect it much. Even if electricity were free, the ROI would still be -68%, which suggests that, in this Bitaxe example, most of the cost is coming from the hardware itself and not the power it consumes.

Next, what if we suddenly time travelled back to 2009, when the mining reward was 50₿? And let's pretend it still had the same exchange rate it does today. Would we be profitable then? Yes, we would be. In fact, we'd only have to travel back to 2012 to receive an awesome 84% return on our investment. Of course, this example is pretty baseless since the mining reward and the overall price of Bitcoin are highly correlated .

Speaking of the Bitcoin price, how do changes in its exchange rate with fiat currencies (like the Canadian dollar, in my calculation) affect our ROI? Let's see. It has a positive linear relationship, which might have been obvious to the more statistically inclined readers. If we entered a bull market where the price of 1₿ exceeded $500,000 CAD, our single Bitaxe miner would be looking like a good investment all of a sudden. But yet again, the price of Bitcoin is not entirely independent of all the other variables in our equation. Namely, the total hashrate in the network would likely increase as Bitcoin's price rises. That's just how a free market works.

Here's the ROI compared against the number of years you keep your Bitaxe miner running. Let's just pretend the mining reward stays the same. Even if you kept your single Bitaxe running for 20 years straight, and were thus able to amortize the cost of the hardware over that entire period, you'd still be sitting at a -40% ROI. I guess it's not about how long you play the game either. So what is in our control?
Obviously, it's how big of a mining setup you have! Instead of 1 Bitaxe, let's buy 100! Huh...I guess that isn't going to help much. In fact, it wouldn't help our ROI at all. Even if you owned a million Bitaxe Gamma 600 Series and ran them 24/7 for 3 years, your return would still be -77%. Both the cost and the expected revenue scale linearly with the number of units, so the ratio between them never changes.

What I've learned is that your mining profitability really comes down to 3 things: The first one is entirely dependent on how you source your electricity for mining. The last two are based on the miner you choose and can be compared nicely using this site: https://www.asicminervalue.com/efficiency

For example, if we look at the #1 rated miner for efficiency on that site, the Bitmain Antminer S21 XP+ Hyd —it boasts a hashrate of 500 TH/s while consuming 5.5 kW of power, which results in an efficiency of 11 J/TH. Compare that with the Bitaxe Gamma, which has an efficiency of 14 J/TH. Admittedly, that doesn't sound like a big difference. But when you scale to computing thousands of terahashes per second, all that energy saving adds up. For instance, if we could lower our energy cost from the national average of 0.192 $/kWh down to 0.05 $/kWh, then mining with the Bitmain Antminer S21 XP+ Hyd would result in an ROI of 9%. If we bought a lot of them and got a bulk discount of 15% on the hardware too, suddenly our ROI jumps to 23%. And if we could keep them running for 5 years instead of 3, our expected return on investment is now 71%. So there's an example of a way to make mining profitable.

This was a highly simplified analysis. There were a bunch of things I didn't consider: So don't take this as financial advice to go buy 100 Antminers and put them in your basement. But I hope you learned something. Let me know in the comments if I made any egregious statistical errors in my analysis, or didn't account for something else important! Feedback is always appreciated. Thanks for reading.
Bitaxe Gamma 600 Series specifications:
- Energy consumption: 17 W
- Hashrate: 1.2 TH/s
- Cost: $220 CAD

The 3 things that determine mining profitability:
- How much your energy costs
- How much the mining hardware costs
- How energy efficient the miners are, measured in joules per terahash computed (J/TH)

Things this analysis didn't consider:
- Hardware maintenance cost
- Cooling cost (lots of miners generate an insane amount of heat)
- Bulk discounts or buying used hardware
- All the correlations between these variables

flowtwo.io 10 months ago

The Master & Margarita

Follow me, reader! Who told you that there is no true, eternal, and faithful love in the world! May the liar have his foul tongue cut out! Follow me, my reader, and only me, and I will show you such a love! — Mikhail Bulgakov, The Master and Margarita pg. 180

I need to start doing research on translations of foreign-language novels before reading them. If a book has been translated more than once, which is the case with a lot of classic literature, there will certainly be differences. The translator's choice of words, sentence structure, and overall prose will affect the final text greatly. Translating fiction is an art form all on its own. It isn't as simple as making a source text understandable in another language. There's a lot of nuance in how something is expressed in different languages. There's often a fundamental trade-off between mirroring the exact words and expressing the meaning of the words. I learned some of this while recently reading this article on translating the works of Homer. I found it quite fascinating.

For The Master & Margarita , I read Mirra Ginsburg's translation, which I learned later is actually incomplete. Ginsburg's translation is based on the Soviet-censored version of the novel, published in Moscow magazine in 1966. Ginsburg's translation was completed only a year later; it was the first English translation available. Since then, there have been several other translations published —and varying opinions on the merits of each. But the fact remains that the Ginsburg translation is based on an edited version of Mikhail Bulgakov's original text. If I could go back in time, I'd do more research and choose a more recent, and more complete, translation. But barring my time machine becoming operational anytime soon, I'll more likely just re-read The Master & Margarita some day, because I really liked it. I'll try a different translation next time.

The Master & Margarita is a story of three narratives.
The first concerns the events in Moscow in the 1930s, when the Devil descends upon it. The second is the story of the protagonists—the "Master" and his lover, Margarita. The third narrative takes place in Jerusalem during the time of Christ, centering on the decision of his death by the bureaucratic governor, Pontius Pilate. All three of these storylines are interwoven throughout the book, and Bulgakov eventually ties them together to form a poignant ending.

What I loved about The Master & Margarita was reading about the absurdity and chaos that befalls Moscow when the Devil comes to visit. I found it really funny. In the story, this "incarnation" of the devil calls himself Woland, which is an ancient German word for the Devil and a reference to Goethe's Faust . Woland and his troupe of hellish bandits inflict all kinds of evil on the citizens of Moscow. They have a flair for the dramatic too. Between the black magic act at the Variety Theater, Satan's Ball, and the curiously apropos punishments inflicted on their victims, the troupe knew how to put on a show.

But Woland's acts of evil were not for the sake of evil. Bulgakov did not paint the Devil as a one-dimensional character, a classic and singular force of cruelty and destruction. Instead, Bulgakov sought to depict evil as a counter-force that balances the world and strives for justice in its own way. Woland certainly punished people for their flaws, but he also rewarded them for their good deeds. For instance, Margarita's selflessness at Satan's Ball was rewarded by Woland with the return of her lover, the Master, and his lost manuscript. Bulgakov wanted to depict "evil" as a necessary part of existence. Maybe he was drawn to the idea of opposing, balancing forces because it's so antithetical to the one-party state and the centralized control of a dictatorship.
There is a need for power to be checked, because what is "good" and what is "evil" are both subjective, and they deserve fair representation.

what would your good be doing if there were no evil, and what would the earth look like if shadows disappeared from it? After all, shadows are cast by objects and people. There is the shadow of my sword. But there are also shadows of trees and living creatures. Would you like to denude the earth of all the trees and all the living beings in order to satisfy your fantasy of rejoicing in the naked light? You are a fool.” — Bulgakov, pg. 280

Why did Mikhail Bulgakov write a story about the Devil coming to Moscow during the Soviet era? Bulgakov lived his whole life in Russia. He witnessed the Russian Civil War as a young man, and lived the remainder of his life in Moscow under the totalitarian regimes of Vladimir Lenin, and then Joseph Stalin. These were times of great social upheaval and transfers of power, and the years that followed exposed vast corruption and poverty. Bulgakov did not have an easy life, at least professionally. This book felt like a lifetime of dissatisfaction and repression, expressed in a darkly comedic way.

For most of his life, Bulgakov was a prisoner in his own country—denied the ability to write freely and also denied the chance to leave. He constantly battled the establishment over the inappropriate subject matter of his art, which was often critical of the government and anti-communist. Due to his controversial writing, he was barred from Russia's publishing industry and unable to release any of his novels. You can feel his disdain for the literary establishment throughout The Master and Margarita . Bulgakov poured all his frustration and disappointment into The Master and Margarita through the story of the Master, a semi-autobiographical character.
In the story, the Master is a writer who is laughed out of the literary community for attempting to write a book about Pontius Pilate. He is crushed by this, and despite the reassurance and love of his mistress, Margarita, he throws the manuscript for his book into a fire during a fit of depression one night. This mirrors Bulgakov himself throwing the original manuscript for The Master & Margarita into a fire in 1930, convinced it would never be published. In the story, the Master is saved by Margarita, who uses the wish granted to her by Woland to return the manuscript from the ashes. And of course, this sort of miracle was easy enough for the Devil to perform. Woland's assurance that "manuscripts don't burn" became a famous phrase in Russia after this book was published. It's a defiant proclamation of the indestructible nature of art, freedom of expression, and our collective desire to preserve these things.

“Forgive me, but I won’t believe it,” said Woland. “This cannot be, manuscripts don’t burn. ” He turned to Behemoth and said: “Come on, Behemoth, let us have the novel.” — Bulgakov, pg. 229

Unfortunately for Bulgakov, he had to completely rewrite the story from memory. But thankfully he did, because it became one of the most acclaimed novels of the 20th century. The Master & Margarita is celebrated because it's truly a piece of art. Bulgakov used his art as a response, an expression of emotion, and a reflection of the world.

Living within the Soviet Union was hard, depressing, and dangerous. As a citizen, the juxtaposition between the propagandist messaging of a glorious country and the sad reality of daily life would be embittering, to say the least. The irony of a Marxist revolution that promised a Utopian society, and the corrupted execution of this dream, was a harsh truth that was impossible for anyone to talk about. Living through all this, Mikhail Bulgakov sought to find the words to express this suffering and confusion.
Through the use of allegory, he produced a remarkable story that combines religion, philosophy, autobiography, governmental critique, and comedy. It's a dark, subdued comedy about madness and the human desire for control, even in the face of the supernatural.

Unfortunately, Bulgakov was not around by the time The Master and Margarita was published. It was published by his widow (not named Margarita) 26 years after his death. And even then, it had to be published in a censored form in Moscow magazine. It took many years before the novel, in its original form, was published, translated, and available for the world to experience. Bulgakov lived a tragic life, and he suffered an early death, but by the grace of God, or possibly the Devil, his most important manuscript didn't burn.

Gods, gods! How sad the evening earth! How mysterious the mists over the bogs! Whoever has wandered in these mists, whoever suffered deeply before death, whoever flew over this earth burdened beyond human strength knows it. The weary one knows it. And he leaves without regret the mists of the earth, its swamps and rivers, and yields himself with an easy heart to the hands of death, knowing that it alone can bring surcease. — Bulgakov, pg. 290

flowtwo.io 10 months ago

Building a New Blog Pt. 2 - Streamlining Writing

My new blog is officially online. If you're reading this, you're on it. I'm still working on some minor enhancements and I'm currently in the process of migrating all my old posts to this new site. In the meantime, I wanted to describe the main improvements I've made to the blog, including one of the primary goals of this project: streamlining my writing and publishing process. I needed to reduce the friction involved with writing, formatting, and publishing new content. My old blog had too many weird and manual steps involved. Let me explain that a bit first. Here's what the process used to look like for adding a new post to my blog:

The first step, writing, involved two different markdown editors on my two primary devices—phone and laptop. My phone is iOS and my laptop runs Ubuntu Linux, so a single app with a client for both systems was hard to find. So hard, in fact, that I didn't find one when I was setting this up in 2018. As a result, if I wanted to work on something across devices, I'd have to manually log in to iCloud on my laptop and download the latest version of my article, load it into Ghostwriter , and then remember to re-upload the updated content after I was done so it'd be available on my phone.

When I was ready to publish something, I had to open my laptop and use Postman to publish it via a REST API. Then I would SFTP into the server to upload all the media files for the blog post. To convert the markdown files into HTML, I had two choices—either convert them during upload and save the HTML content into the database, or save the markdown in the database and convert it into HTML at request time. I decided on the former, which I regret in retrospect. Being able to save and edit the HTML directly via the admin page was good for quickly changing and formatting things, but it led to the published content gradually drifting away from the source markdown.
And when I made changes to the markdown conversion engine, there was no way to re-process existing posts, since I'd made edits that only lived in the HTML version of the files. Basically, there was no single source of truth for my content. When I decided to rewrite my blog, improving this process was a top priority. I wanted to make writing and publishing as easy as possible.

While writing something, it's important that I can work from multiple devices and have changes synced between all of them. That way I can write whenever and wherever I want—on my phone on the subway, on my work laptop at the office, on my personal laptop at a cafe, etc. If my devices were all Apple, cross-device support and sync would be a lot easier. Something like iA Writer is a popular choice on the App Store. Finding something cross-platform is a bit more difficult. Web-based tools like Notion or Google Docs are options, but I don't want to deal with editors in the browser. The UI is always clunky and most browser apps don't have good offline support.

My research led me to Obsidian —a markdown-based writing tool with end-to-end encrypted server sync and native apps for both iPhone and Linux. It's a well-polished, no-frills product, and you can optionally pay an annual fee for cloud sync. It also has plugin support and an active developer community. There was even an existing plugin that allows me to import annotations directly from my Kobo; just one more part of my writing process that is easier now.

Speaking of plugins, I wrote one for publishing directly to my blog right from Obsidian. It's on my GitHub if you're interested. With this installed on all my devices, I can publish new posts directly from the Obsidian app. This allows me to do something I've never been able to do before: write, format, and publish new content all from my phone—with ease. It's great. Putting it all together, here's my new writing flow for this blog: So much easier.
There's an important piece of functionality that's critical for making this whole process work: ensuring markdown is all I need for formatting posts. If I ever needed to go in and edit the HTML afterwards, this seamless publishing process would fall apart. I want the files in Obsidian to be the single source of truth for my blog content. To accomplish this, I had to tweak the Markdown-to-HTML processing engine to output exactly what I wanted the DOM to look like. Afterwards, I could adjust the site's CSS so everything is styled accordingly. I'll go over these details in Part 3 of this series. Thanks for reading!

flowtwo.io 1 year ago

Handling Multipart Requests with Ktor

I'm building a new blog site with Ktor, a Kotlin-native HTTP framework. It's an asynchronous framework built on top of Kotlin coroutines. So far I've liked its simple, unopinionated interfaces that allow you to structure your application however you want. Ktor is configured via a builder pattern, specifically a type-safe builder enabled by Kotlin's lambda functions with receiver object syntax. The code looks a little peculiar at first glance, but it makes sense once you understand the implicit receiver type declared in all the builders of the library (and associated plugins). It's basically like an explicit, compile-time declared context. It enforces the scope of all the code you write. Combined with Kotlin's extension function syntax, you can end up with really clean, declarative code.

Anyway, one issue I was dealing with lately was trying to get multiple files uploaded at once, in a single multipart request. This is how the Ktor documentation shows how to do it: So I'm trying to do the same thing, but the difference in my situation is that I need information from the to process each file upload. So in the interim, I store each in a list to be processed afterwards. This is my code: But for some reason, the files keep getting saved with 0 bytes: What's happening here? I see in the network request from the browser that the file data is being sent correctly, so something funky is happening when trying to save the files to disk.

The first step of debugging is always adding some logs. So I add a log in the loop: Which spits out: OK, so is clearly empty. If I change my code to execute the same way the documentation example works, the files get saved successfully! Of course, in place of the argument for , I use a dummy value in my test. But this works. So the question is, why can't I read the file data outside of this lambda? My original code keeps a reference to the and then loops through them later. Why doesn't that work? It's because returns a object.
Let's look at the source code for its method: So under the hood, it's converting the multipart data stream into a Flow and calling on that Flow with the supplied lambda function. Flows are a native feature of the Kotlin coroutines library: an asynchronous data-streaming primitive, specifically a cold stream. They operate like generators, but combined with coroutines they enable fully suspendable streaming code. Digging into the Ktor source code , I find the problem is in : As part of reading the next part, the previous part's function is called. This discards the body content of the previous part and allows the memory to be reclaimed. So, for my case I had to modify my code to read each file part's data as it's emitted and store that in memory before I can save it: Of course, I could also change the API signature to retrieve the post ID in the upload endpoint's path parameters instead, so I could actually save each file to disk as it's processed. A needless optimization, so I'll save it for another day.
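For reference, here's a minimal sketch of the working pattern, assuming a Ktor 2.x-style API (`receiveMultipart`, `forEachPart`, `streamProvider`); the route, file names, and uploads directory are illustrative, not the post's actual code.

```kotlin
import io.ktor.http.HttpStatusCode
import io.ktor.http.content.PartData
import io.ktor.http.content.forEachPart
import io.ktor.http.content.streamProvider
import io.ktor.server.application.call
import io.ktor.server.request.receiveMultipart
import io.ktor.server.response.respond
import io.ktor.server.routing.Route
import io.ktor.server.routing.post
import java.io.File

fun Route.uploadRoute() {
    post("/upload") {
        val pending = mutableListOf<Pair<String, ByteArray>>()
        call.receiveMultipart().forEachPart { part ->
            when (part) {
                is PartData.FileItem -> {
                    val name = part.originalFileName ?: "unnamed"
                    // Read the bytes NOW, inside the lambda: a stored PartData
                    // reference is disposed (emptied) when the next part is read.
                    pending += name to part.streamProvider().readBytes()
                }
                else -> {}
            }
            part.dispose()
        }
        // The buffered bytes are safe to process after the loop.
        pending.forEach { (name, bytes) -> File("uploads/$name").writeBytes(bytes) }
        call.respond(HttpStatusCode.OK)
    }
}
```

The key difference from the broken version is that nothing holds a `PartData` reference past the `forEachPart` lambda; only plain byte arrays survive the loop.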

flowtwo.io 1 year ago

Site Reliability Engineering

After all, people regularly use www.google.com to check if their Internet connection is set up correctly. — JC van Winkel, Site Reliability Engineering pg. 25 Site Reliability Engineering is a collection of essays written by senior engineers at Google describing how they run their production systems. It's mostly within the context of the Site Reliability Engineering (SRE) organization, which is an actual job title there. However, I found the subject matter to be quite wide-ranging, everything from people management to distributed consensus algorithms. It didn't focus strictly on the SRE discipline, which partly explains why it's 500 pages long. The whole book is actually available for free online if you're interested in reading it. Or just parts of it, since each chapter is a separate topic and there's not much overlap between them. In essence, the SRE organization is a specialized discipline within Google meant to promote and maintain system-wide reliability for their services and infrastructure. Reliability is such a multi-faceted objective that the expertise and responsibilities required are wide-ranging. The end goal seems simple to explain: Ensure systems are operating as intended. But reaching that goal requires a combination of technical, operational, and organizational objectives. As a result, this book touches on basically every topic of interest for a software company. I spent a couple years working in a Nuclear Power Plant, so I've seen what peak reliability and safety culture looks like. The consequences of errors there are so much higher compared to most other companies, including Google. So it's not a surprise that reliability and safety are the paramount objectives and they take priority over everything else. This safety culture permeated everything we did within the plant, and around it. There were lines painted on each row of the parking lot to indicate the path to the entrance. 
If you didn't follow them, you would get coached and written up by someone. It was intense. And don't even think about not holding the railing while taking the stairs either... Any change you want to make to a system within the plant needs extensive documentation, review, and planning before being approved. Thus, the turnaround on any change takes months, if not longer. Contrast that with software companies like Google, where thousands of changes are made on a daily basis. The consequences of a mistake can still be serious, depending on the application. But instead of aiming for zero errors, errors are managed like a budget, and the rate at which this budget is spent determines how much change can be made in a given period of time: In effect, the product development team becomes self-policing. They know the budget and can manage their own risk. (Of course, this outcome relies on an SRE team having the authority to actually stop launches if the SLO is broken). — Marc Alvidrez, pg. 51

Learning about Google's software development process was interesting. In the first few chapters, there was a lot of useful information on measuring risk, monitoring, alerting, and eliminating toil. These were some of the more insightful chapters, in my opinion. But there were also a few...less insightful chapters. Chapter 17 was about testing code, and it really just stated obvious things about writing tests; it wasn't specific to SRE at all. Then there was a lot of time spent on organizational stuff, like postmortem culture and how to have effective meetings. Much of that writing came off as anecdotal and rather useless advice that the author tried to generalize (or just make up) from past experiences.

So there were good and bad parts of the book. I wouldn't recommend reading it cover to cover like I did. It'd be better to just read a chapter on a topic that's relevant for you. For instance, I found the section on load balancing to be really informative.
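Backing up to the error-budget idea quoted above: the budget is just the complement of the SLO applied to a time window (or request volume). A quick illustrative sketch, with numbers that are mine rather than the book's:

```kotlin
// Error budget = (1 - SLO) * window. With a 99.9% availability SLO over a
// 30-day window, roughly 43 minutes of downtime are "allowed" before the
// budget is exhausted and launches slow down.
fun errorBudgetMinutes(slo: Double, windowMinutes: Double): Double =
    (1 - slo) * windowMinutes

fun main() {
    val window = 30 * 24 * 60.0  // a 30-day window, in minutes
    println(errorBudgetMinutes(0.999, window))  // ~43.2
}
```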
Below is a summary of how Google does load balancing. Chapters 19 and 20 are about how Google handles their global traffic ingress. Google, by operating one of the largest distributed software systems in the world, definitely knows a thing or two about traffic load balancing. Or to put it in their words: Google’s production environment is—by some measures—one of the most complex machines humanity has ever built. — Dave Helstroom, pg. 216 Melodrama aside, I appreciated the clear and concise breakdown of their networking and traffic management in these chapters. Load balancing needs to consider multiple measures of quality. Latency, throughput, and reliability are all important and are prioritized differently based on the type of request.

Chapter 19 is about load balancing across datacenters. Google runs globally replicated systems, so figuring out which datacenter to send a particular request to is the first step in traffic management. The main mechanism for configuring this is via DNS — a.k.a. the phone book of the internet. The goals of this routing layer are twofold:

- Balance traffic across servers and deployment regions fairly
- Provide optimal latency for users

DNS responses can include multiple IP addresses for a single domain name, which is standard practice. This provides a rudimentary way of distributing traffic, as well as increasing service availability for clients. Most clients (i.e. browsers) will automatically retry requests to different records in the DNS response until they successfully connect to something. The downside is that the service provider, Google, has little control over which IP address actually gets chosen from the DNS response, so it can't be solely relied on to distribute traffic. The second goal of DNS is to provide optimal latency to users, which means trying to route their requests to the geographically closest server available to them.
This is accomplished by having different DNS nameservers set up in each region Google operates in, and then using anycast routing to ensure the client connects to the closest one. The DNS server can then serve a response tailored to that region. This sounds great in theory, but in practice DNS resolution is hairier, and there are lots of issues specifically around the caching introduced by intermediary nameservers. I won't go into those details here. Despite all of these problems, DNS is still the simplest and most effective way to balance load before the user’s connection even starts. On the other hand, it should be clear that load balancing with DNS on its own is not sufficient. — Piotr Lewandowski, pg. 240

The second layer of load balancing happens at the "front door" to the datacenter—using a Network Load Balancer (NLB), also known as a reverse proxy. These handle all incoming requests by broadcasting a Virtual IP (VIP) address. The NLB can then proxy incoming requests to any number of actual application servers. In order to retain the originating client details after proxying a request, Google uses Generic Routing Encapsulation (GRE), which wraps the entire IP packet in another IP packet. There's some complexity here, of course, in terms of the actual routing algorithm used by the NLB. Supporting stateful protocols like WebSockets requires the NLB to keep track of connections and forward all requests to the same backend for a given client session.

Once a request has reached an application server, there will likely be a multitude of internal requests initiated in order to serve it. In order to produce the response payloads, these applications often use these same algorithms in turn, to communicate with the infrastructure or complementary services they depend on.
Sometimes the stack of dependencies can get relatively deep, where a single incoming HTTP request can trigger a long transitive chain of dependent requests to several systems, potentially with high fan-out at various points. — Alejandro Forero Cuervo, pg. 243

And besides that, there are plenty of requests and computational work that don't originate from end users. Cronjobs, batch processes, queue workers, internal tooling, machine learning pipelines, and more are all different forms of load that must be balanced within a network. That's what Chapter 20 covers. The goal of internal load balancing is mostly the same as for external requests. Latency is still important, but the main focus is on optimizing compute and distributing work as efficiently as possible. Since there's only so much actual CPU capacity available, it's vital to ensure load is distributed as evenly as possible to prevent bottlenecks or the system falling over due to a single overloaded service. Within Google, SRE has established a distinction between "backend tasks" and "client tasks" in their system architecture: We call these processes backend tasks (or just backends). Other tasks, known as client tasks, hold connections to the backend tasks. For each incoming query, a client task must decide which backend task should handle the query. — Cuervo, pg. 243

Each backend service can be composed of hundreds or thousands of processes. Ideally, all backend tasks operate at the same capacity and the total wasted CPU is minimized. The client tasks hold persistent connections to the backend tasks in a local connection pool. Due to the scale of these services, it would be inefficient for every single client to hold a connection to every single backend task, because connections cost memory and CPU to maintain. So Google's job is to optimize an overlapping subset problem—which subset of backend tasks should each client connect to in order to evenly spread out work.
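To make the subset-assignment idea concrete, here's a rough sketch of a deterministic, shuffle-based scheme in the spirit of the one the book describes (simplified, and not Google's actual implementation): clients are grouped into rounds, every client in a round shuffles the backend list with a shared seed, and each client takes its own disjoint slice.

```kotlin
import kotlin.random.Random

// Deterministic subsetting sketch: clients in the same "round" shuffle the
// backend list identically, then take disjoint slices, so the number of
// client connections per backend stays even.
fun deterministicSubset(backends: List<String>, clientId: Int, subsetSize: Int): List<String> {
    val subsetsPerRound = backends.size / subsetSize
    val round = clientId / subsetsPerRound
    val shuffled = backends.shuffled(Random(round.toLong()))  // same seed per round
    val offset = (clientId % subsetsPerRound) * subsetSize
    return shuffled.subList(offset, offset + subsetSize)
}

fun main() {
    val backends = (0 until 12).map { "backend-$it" }
    val counts = mutableMapOf<String, Int>()
    // 8 clients, subset size 3: two full rounds, each covering all 12 backends once.
    for (client in 0 until 8) {
        for (b in deterministicSubset(backends, client, 3)) {
            counts[b] = (counts[b] ?: 0) + 1
        }
    }
    println(counts.values.toSet())  // every backend ends up with the same connection count
}
```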
Using random subsetting didn't work: in the book's example, the least utilized backend sat at only 63% while the most utilized hit 121%. Instead, Google uses deterministic subsetting, which evenly balances the connections between clients. It's an algorithm that shuffles and assigns backends to each subset evenly. Again, I won't go into detail about it. Once the pool of connections has been established for each client task, the final step is to build an effective load balancing policy for these backends. A simple round robin algorithm didn't work, as evidenced by historical operational data. The main reason is that different clients will issue requests to the same backends at vastly different rates, since they could be serving completely different downstream applications. There may also be variation in the cost of different queries, backend machine diversity, and unpredictable factors like antagonistic neighbours. Instead, Google uses weighted round robin, which keeps track of each backend's current load and distributes work based on that. At first they weighted purely by the number of active requests to each backend, but that doesn't tell the whole story of how healthy a particular backend is. So instead, each backend sends load information to the client in every response, including active request count, CPU, and memory utilization. The client uses this data to distribute the flow of work optimally. Here's a crappy diagram I made to visualize everything.

Site Reliability Engineering offers many insights shared by senior engineers from one of the world's leading software companies. I particularly enjoyed the sections on alerting, load balancing, and distributed computing. But there were some chapters I found boring and without much useful, actionable advice. Google has been a leader and innovator in tech for many years. They're known for building internal tools for basically every part of the production software stack and development life cycle.
A lot of these tools have been re-released as open source libraries, or even turned into new companies started by ex-Googlers. For instance, Google has been running containerized applications for over 20 years. As the scale of running services and jobs this way expanded, the manual orchestration and automation scripts used to administer these applications became unwieldy. Thus, around 2004 Google built Borg — a cluster operating system which abstracted these jobs away from physical machines and allowed for remote cluster management via an API. Then, 10 years later, Google announced Kubernetes, the open source successor to Borg. Today, Kubernetes is the de facto standard for container orchestration in the software industry. All this to say: Google has encountered many unique problems over the years due to its sheer complexity and unprecedented scale, and it's been forced to develop novel solutions. As such, it's helpful to look to them as a benchmark for the entire software industry. Understanding how they maintain their software systems is helpful for anyone looking to improve their own.

flowtwo.io 1 year ago

Gains & Losses

I mostly stay away from individual stock investing. It's too risky, and I don't pretend to know better than the rest of the financial market. My " dumb money " mostly goes into broad market ETFs with low management fees. However, sometimes you just want to try your luck at the casino. That's how I felt last November when I had a lump sum of money I needed to invest. Instead of putting all of it into a low-risk ETF, I decided to do a bit of research and choose some other investment options. Now, I'll admit I didn't pull up any of these companies' financial records. I didn't do fundamental analysis. But I read some things—not just /r/wallstreetbets—and chose some stocks. So my choices were based on some data, but mostly vibes.

It's been exactly 12 months since I invested this money. I wanted to look back and see how these investment choices panned out. Of course, 1 year isn't super long in the grand scheme of market cycles, but it's a useful exercise nonetheless. This also gives me a chance to do some data visualization, which I haven't done much of before. I like Kotlin, so I decided to try out Kotlin Notebooks combined with Kandy , a graphing library natively supported by Kotlin Notebooks and JetBrains IDEs. The complete code is on my GitHub if you're interested. The actual dollar amount isn't relevant for this discussion, so let's just refer to it as here. Here's the breakdown of how I invested the dollars. Let's visualize this breakdown: So, I still put most of the money in ETFs, but only has broad market exposure. And is a bond ETF, which is generally considered a hedge against market downturns. It's pretty straightforward to get historical price data for all these securities from the Nasdaq website .
I downloaded the last 5 years of data for each of them, imported them into my Notebook, sanitized the data a bit, and then plotted the last 1.5 years on a graph: The result is shown below, with a dashed line indicating the 11/30/2023 purchase date: This gives a basic indication of how things went. did well. did not. But a better comparison is, of course, the relative change instead of the absolute price of the stock. In other words, I want to visualize my return on each investment instead. Side note: I found out halfway through this exercise that the graphing library, Kandy, is still very much in development. Plotting multiple series of data on the same graph isn't really supported; you have to do some hacky list concatenation, which was really lame. I don't think I'll use Kandy again until they've improved this. This gives us a consistent comparison of each stock's performance since the buy date: Here we see is up 31% right now, while is down 20%. Everything else is somewhere in the middle.

I want to analyze how my investment choices fared—not just what I bought but also how much I bought, relative to the total dollars I had. This is called the portfolio return . The formula to calculate it is straightforward since I'm only considering a single purchase date: you multiply each investment's return by the relative amount invested and then sum them up. Essentially, you're "weighting" each return by the percentage of your portfolio exposed to it. I plotted the overall portfolio return alongside each investment's portfolio-weighted return: This shows that most of the portfolio's gains were due to , which is not surprising. It also shows how and barely affected my overall portfolio since I only invested about 5% of in each. Let's also plot the portfolio return alongside the absolute returns of each stock. I think this is more illuminating: Here's where we really see how my investment choices balanced each other out in the aggregate.
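That weighted sum is a one-liner. Here's a plain-Kotlin sketch with placeholder tickers, weights, and returns (the real ones are elided above):

```kotlin
// Portfolio return = sum over holdings of (portfolio weight) * (asset return).
// Tickers, weights, and returns below are illustrative placeholders.
fun portfolioReturn(weights: Map<String, Double>, returns: Map<String, Double>): Double =
    weights.entries.sumOf { (ticker, w) -> w * returns.getValue(ticker) }

fun main() {
    val weights = mapOf("BROAD" to 0.50, "BOND" to 0.20, "SOLAR" to 0.25, "PICK" to 0.05)
    val returns = mapOf("BROAD" to 0.31, "BOND" to 0.02, "SOLAR" to -0.20, "PICK" to 0.10)
    // 0.50*0.31 + 0.20*0.02 + 0.25*(-0.20) + 0.05*0.10
    println(portfolioReturn(weights, returns))
}
```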
Fortunately, I still have a positive return after a year, mostly because the Nasdaq-100 (mostly tech) had a great year. But investing a quarter of my money in solar didn't pan out so well. It's not always sunny on Wall St. Finally, just to drive the stake into my wallet further, here's my portfolio return plotted against just : Overall, a difference of percentage points missed out on because of my decision to cosplay as a day trader for fun. To put that in perspective, if I had dollars to invest last year, I would've had an additional $2070 in my pocket today. That's like, half a Taylor Swift ticket. Oh well, lesson learned: Don't use this unfinished graphing library for data viz anymore. And probably stick to ETFs.

flowtwo.io 1 year ago

Why We Sleep

When awake, we see only a narrow set of all possible memory interrelationships. The opposite is true, however, when we enter the dream state and start looking through the other end of the memory-surveying telescope. Using that wide-angle dream lens, we can apprehend the full constellation of stored information and their diverse combinatorial possibilities — Matthew Walker, pg. 203

I have a hunch that most people who identify as "book readers" like to read before bed. In fact, I'm confident that's where a lot of readers get most of their reading done. However, I admit I have no evidence or statistics to back this up; mostly because I'm not a scientist and I haven't done any research. Welcome to my blog. Fortunately, there are people like Dr. Matthew Walker who IS a scientist and DOES do research before sharing their theories with the world. Walker is a professor of neuroscience at UC Berkeley and he has spent his career studying sleep. This book - Why We Sleep - is his magnum opus; a distillation of decades spent revealing the secrets of the strange, yet essential, nocturnal phase of our existence.

It feels like an injustice to this book to sum it up by saying "sleep is good for you". I was astounded by the breadth of topics covered by Dr. Walker regarding sleep and its effects on our body and health. Part 2 of the book is entitled "Why Should You Sleep" and I kid you not, the following is an incomplete list of the benefits that adequate sleep has been shown to promote:

- emotional regulation
- learning efficacy
- memory retention
- expected lifespan
- decreased psychiatric disorder risk
- decreased injury risk
- decreased cancer risk
- decreased Alzheimer's disease risk
- decreased type-2 diabetes risk
- decreased car crash risk
- increased testosterone levels
- increased testicle size

Yes, even that last one. After reading the details of all the studies and methodologies behind proving these correlations, I felt the emotional impact of each lesson began to dull after a while. Like hearing that the world's on fire every day in the news, being told that sleep is really good for you starts to get repetitive. So I think that Part 2 sort of drags on a bit.

Thankfully, the remainder of Why We Sleep was more focused on how to get better sleep — and what the hell is going on when we dream! I found both these topics to be much more interesting. Like I mentioned before, I do a lot of reading in bed. This was both a good and bad book to read before sleeping. On the one hand, learning about all the innumerable ways that sleep is good for me was a great headspace to end my day in, allowing my mind to drift off and start to ride those rejuvenating REM and NREM brainwaves. On the other hand, for the days when I couldn't sleep, or was going to sleep late, or was in an uncomfortable environment where I wasn't in control of my sleeping space — knowing the exact reasons why I couldn't sleep ( "ugh my hands and feet are too hot" ) was almost worse ( "agh I looked at a screen too recently" ) and tended to exacerbate my stress ( "shouldn't have had that green tea at 4PM" ). But alas, I truly believe that knowledge is powerful and ignorance is not blissful. I chose to read this book because I wanted to know more about sleep and now I do. I took away a lot of interesting information and useful tips. I think this knowledge will help me sleep better, but more importantly I now have random factoids to drop into conversations whenever they turn to sleep.

In particular, I learned that humans naturally have a biphasic circadian rhythm , which is a really cool term to break out at parties. In English, it means we're biologically hardwired to nap once a day. That mid-afternoon lull you feel every day turns out to be totally natural and not just because your lunch was a big bowl of cheesy gnocchi. Well, the gnocchi could be part of it honestly (reminder: not a scientist). Not only is it natural, a short afternoon nap is apparently healthy for you too. Walker points to several studies, some from his own lab, that have illuminated the subtle but measurable ways that a short afternoon nap is beneficial.

Those who were awake throughout the day became progressively worse at learning, even though their ability to concentrate remained stable (determined by separate attention and response time tests). In contrast, those who napped did markedly better, and actually improved in their capacity to memorize facts. — Walker, pg. 102

From a longitudinal study of Greece and the decline of its siesta culture over the late 20th century, there was a clear relationship between reduced naps and the risk of heart disease: However, those that abandoned regular siestas went on to suffer a 37 percent increased risk of death from heart disease across the six-year period, relative to those who maintained regular daytime naps. The effect was especially strong in workingmen, where the ensuing mortality risk of not napping increased by well over 60 percent. — Walker, pg. 69

Leading to the natural conclusion, in Walker's own words: From a prescription written long ago in our ancestral genetic code, the practice of natural biphasic sleep, and a healthy diet, appear to be the keys to a long-sustained life. — Walker, pg. 70

While reading about napping and all its great benefits in Why We Sleep , I was reminded of a headline I'd seen somewhere years ago that said something along the lines of: BREAKING: New Scientific Study by Scientists Shows Napping Causes Bad Health Things to Happen and You Might Die Sooner, According to Science. That might not be verbatim, but you get the point. Now I'm not one to believe everything I read on the internet. But I do like to base my entire worldview on a subject according to a single headline I skimmed over once and never looked into further. So needless to say, finding out there was conflicting evidence on the benefits of napping was rather shocking. I was losing sleep over it. With such uncertainty circling in my head, I felt the need to do more investigation.

So I decided to finally put on my scientist hat and do some good old fashioned research on the matter. There have indeed been several studies published over the last couple decades which associated napping with increased risk of hypertension , cardiovascular disease and diabetes , to name a few. According to that last one, the correlation with diabetes risk was "partly explained by adiposity". Adiposity is a really technical term for being fat; and being fat, interestingly enough, was also found to be correlated with regular daytime napping in a separate study ! This begs the question — what came first, the couch or the glucose intolerance? It's hard to say, since these long-term epidemiological studies are observational and can only suggest potential causes of a disease based on statistical evidence. The risk of confounding factors makes these studies hard to trust entirely, which is why the conclusions drawn must use wording like: "increased daytime nap frequency may represent a potential causal risk factor for essential hypertension." For us normies, these studies lead to clickbait headlines and articles touting the risks of this seemingly benign activity:

- Long Naps May Be Bad For Your Health | Forbes
- Napping regularly linked to high blood pressure and stroke, study finds | CNN
- You snooze, you lose: why long naps can be bad for your health | The Guardian

With all these morbidities being linked to napping, it's easy to see how one could conclude that naps are indeed bad for them. I haven't found any explanations offered as to why napping might be bad for your health; so my research, along with the entire state of nap science, is apparently stalled for now. But what does our sleep expert, Dr. Walker, say about all this? Walker makes no mention of any of these studies in Why We Sleep (granted, many of them were published after his book was written). Walker's only words of warning come in the Twelve Tips for Healthy Sleep Appendix: Don't take naps after 3 p.m. Naps can help make up for lost sleep, but late afternoon naps can make it harder to fall asleep at night. — Walker, pg. 325 So, according to Walker, naps are good for you as long as you take them early enough in the day.

Most studies I read as part of my research mentioned nap length as an important factor in these negative health correlations. For example, this study showed those who regularly take short naps (< 30 minutes) were less likely to have high blood pressure, and those who take long naps were more likely to have high blood pressure. Putting it all together, I've decided my new worldview on naps boils down to the following:

- Napping early in the afternoon for no more than 30 minutes is: probably fine, possibly beneficial for you
- Napping for longer than 30 minutes regularly is: possibly bad for you, more likely a symptom of some other underlying health issue(s) or poor lifestyle habits

And of course, my final research source was asking ChatGPT, which obviously told me exactly the same thing: Why bother researching the internet when ChatGPT has already read the whole thing? Oh well, now I know for next time. Reading books is probably outdated by this point too, but it still helps me get to bed at night. Especially a book like Why We Sleep that evangelises the life-changing power of slumber. So until AI can start singing me lullabies*, I'm clinging to my books!

*editor's note: turns out AI can already do this

flowtwo.io 1 year ago

Building a New Blog Pt.1

It's time to update my website. Over the last couple years I took a bit of a hiatus from posting new content, but this year I've rediscovered the motivation for writing. I'm also interested in writing about more subjects, not just book reviews. In particular, I'm going to start writing about software engineering and coding more, which will require some changes to the formatting of posts. Because of these new requirements, I've decided to rebuild my writing "stack" and the platform that powers my blog. The main things I don't like about my current writing flow and site:

- The site's UI is outdated. While I'm proud of the handcrafted *artisanal* HTML I wrote for the original site, I want to redesign it to match my current tastes.
- There's too much manual HTML editing required for new posts. I write posts in markdown and then convert them to HTML programmatically, but then I usually have to modify the HTML output to finalize the formatting.
- My deployment infrastructure is not cloud optimized or properly decoupled. Sure, hosting your Spring Boot app, MongoDB server, media assets, and Jenkins server on a single EC2 instance is possible . Is it a good idea? No.

So there's nothing horribly wrong with the current site. It works. Which isn't surprising given it's a static blog that changes infrequently. But I think rebuilding the site will help invigorate my writing and improve my efficiency for generating new content. Plus, any project is an opportunity to learn, so I'm excited to work with some stuff I don't use often and try out some new technology. My goals for this new site (code named flow2 ) are the following:

- Refresh the UI
- Containerize and use better cloud tooling for the infrastructure
- Markdown files as the single source of truth for content. No manual HTML editing required
- Streamline the entire process between writing and posting
- Try out Ktor — an async web framework built in Kotlin with coroutines

And that's it! Plus I'll probably get a new domain name too. Looking forward to building. You can check out my progress on Github if you're interested.

p.s. this is my new blog

flowtwo.io 1 year ago

Designing Data-Intensive Applications

If you’re used to writing software in the idealized mathematical perfection of a single computer, where the same operation always deterministically returns the same result, then moving to the messy physical reality of distributed systems can be a bit of a shock. — Kleppmann, pg. 343

As an engineer with a fair amount of professional experience, I found Designing Data-Intensive Applications by Martin Kleppmann to be extremely illuminating. It's a comprehensive overview of modern data storage and data processing technology, with a little bit of history thrown in. I was already familiar with many of the concepts and tools in the book, but it was interesting to read about them in a broader context. Kleppmann takes time to describe the rationale behind each software framework or tool he introduces, including the problems it solved that existing technologies didn't and what trade-offs it makes. This helped improve my mental model and solidify my overall understanding.

Kleppmann sticks to a particular topic in each chapter — concepts like data encoding, partitioning, or stream processing. As such, in theory this book could be read piecewise; every chapter stands on its own like a textbook. But Kleppmann did a good job organizing topics so that reading the book end-to-end is worthwhile too. The lessons of one chapter build upon the next in complementary ways.

The book is divided into three parts. Part I starts with some fundamental concepts of data systems as they relate to a single machine — how you model, store, and query for data. Then we move beyond a single computer in Part II, and a new host of problems is introduced: how to encode data, replicate it, partition it, and achieve consistency across a distributed system (one of the scarier words in software engineering). In the final section of the book, Kleppmann focuses on heterogeneous systems of data storage.
It covers how data is stored and transferred across disparate datastores, indexes, caches, streams, and more, along with some best practices for building such systems. All these topics require a familiarity with the material covered in previous chapters to fully understand the intricacies of the problems being solved. You continuously zoom out to higher levels of abstraction as the book progresses. That's something I really liked about Designing Data-Intensive Applications.

Next I will share three lessons that stood out to me after reading the book. Hopefully you will find something useful in my brief summaries.

A large portion of this book is dedicated to databases, obviously. Chapter 2 covers how to model and store your data in different types of databases. In Chapter 3, Kleppmann dives into how databases are actually implemented, particularly relational / SQL-style databases. But he also introduces several other database classes, generally grouped under the umbrella term "NoSQL". Which just means not SQL. Document-oriented, wide-column, key-value, and graph databases are all alternatives meant for different use cases.

Several of these databases are quite popular, for example MongoDB, a document database that stores unstructured JSON in named collections. I've always considered this type of database to be schemaless, since each JSON document in a collection can have completely different fields. But Kleppmann explains that it should be considered schema-on-read instead. This is in contrast to a SQL database, which is schema-on-write because it enforces a predefined schema when you attempt to add or update records.

Document databases are sometimes called schemaless, but that’s misleading, as the code that reads the data usually assumes some kind of structure—i.e., there is an implicit schema, but it is not enforced by the database [20]. A more accurate term is schema-on-read (the structure of the data is implicit, and only interpreted when the data is read) — Kleppmann, pg. 51

I think this is an important distinction to make, and it makes sense when you think about it. Stored data has no use if it can't be parsed into some known set of fields, so of course at some point a schema needs to be "applied". Sure, you can add if-exists checks to every field to avoid making any assumptions, but the same thing could be done with SQL by making every field nullable. Schema-on-read is analogous to runtime type checking in programming languages, whereas schema-on-write is similar to compile-time type checking.

The schema-on-read approach is advantageous if you have little control over the data your system is storing and it may change frequently over time. It's easier to change your schema if it's part of your application code. This is why MongoDB, and more broadly any schema-on-read approach, is generally considered more flexible and less error-prone as your application changes.

Read-heavy systems are very common in software, especially web applications. I'm personally more accustomed to optimizing read efficiency, using techniques such as database indices. A database index will make writes slower (because they need to update the index) but reads much faster. But sometimes, the system you're designing needs to handle extremely high write throughput. In this case, we want to shift some of the work back to the read path in order to make the write path faster.

For instance, in Chapter 5 Kleppmann covers database replication. There is a style of database called Dynamo, which is a set of techniques developed at Amazon and implemented in their DynamoDB service. Other popular databases like Cassandra and Voldemort were modelled on Dynamo. These databases are leaderless and use asynchronous replication to achieve high write throughput. One technique that stood out to me is called read repair:

When a client makes a read from several nodes in parallel, it can detect any stale responses.
For example, in Figure 5-10, user 2345 gets a version 6 value from replica 3 and a version 7 value from replicas 1 and 2. The client sees that replica 3 has a stale value and writes the newer value back to that replica. — Kleppmann, pg. 199

Because Dynamo-style databases use quorums for consistency between nodes, the read operation also updates any outdated values found when the nodes responsible for a given key are queried. The coordinator node will "repair" those outdated values synchronously before completing the read operation. This is an example of doing more work during reads in order to make writes faster.

Another, more general principle for increasing write throughput is storing data as an immutable log of append-only records. Writing to a single ledger is relatively fast computationally and can be scaled in many ways. This technique is touched on countless times by Kleppmann throughout the book. Change data capture via the binlog, event sourcing, batch and stream processing — these are all examples of systems that write data as an append-only log available for downstream consumers to parse and derive the relevant data needed for their use case. More work is needed when reading (or by intermediary systems that construct the data in an appropriate form), to the benefit of allowing very high rates of write operations.

One concept that Kleppmann spends a great deal of ...*ahem*... time on is dealing with time in a distributed system. It turns out to be a tricky business; some may even go so far as to call time an illusion. Having a notion of time is critical for so many algorithms and mechanisms in software applications, so understanding its edge cases and complexities is important. Kleppmann spends a good portion of Chapter 8 just discussing the myriad ways that assumptions about time can be wrong.

Even for a single computer, time is an illusion. There are generally two different clocks available for you to use (and mess up).
The time-of-day clock represents the "real-world" time and is used when you need a timestamp, while the monotonic clock is based on an arbitrary starting point but is guaranteed to always increase (at approximately 1 second per second). You use the monotonic clock when you want to measure the duration between two events.

Once multiple machines are involved, simple questions like "what time is it?" and "what happened first?" become deceptively hard to answer. For instance, a common way of dealing with write conflicts is the "Last Write Wins" mechanism. When two machines want to modify the same record at the same time, just choose the write that happened later. The problem is, how do you determine which write happened last? If you use the time you received the writes, then you risk violating causality, since a write could've been delayed for any number of reasons. If the clients generate timestamps themselves, you suddenly need to deal with differences in their local clocks. If a node has a lagging clock, then all its writes might be overwritten for a while before anyone notices.

So, to make sure all the nodes in our system have the right time, we use the Network Time Protocol (NTP) to periodically synchronize all the clocks against a super accurate time source. But, like any network communication, NTP is also susceptible to a number of fault modes. I won't detail them here.

Leap seconds are another good example of time's illusory nature. Leap seconds have crashed entire systems before. Nowadays leap seconds are handled by having the NTP server "lie" via a process called smearing, which gradually applies the extra second over an entire day. If you can't trust your NTP server, who can you really trust?

I think the complexity of time is emblematic of distributed systems as a whole.
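The single-machine distinction between the two clocks is easy to see in code. A minimal Python sketch (standard library only), using each clock for its intended purpose:

```python
import time

# Time-of-day clock: a "real-world" timestamp, e.g. for labelling events.
# NTP can step this clock backwards, so it's unsafe for measuring durations.
timestamp = time.time()

# Monotonic clock: arbitrary starting point, but guaranteed never to go
# backwards -- the right tool for measuring elapsed time between two events.
start = time.monotonic()
time.sleep(0.01)  # stand-in for some real work
elapsed = time.monotonic() - start

# elapsed is always non-negative; a difference of two time.time() calls
# carries no such guarantee if the wall clock was adjusted in between.
assert elapsed >= 0
```

Swapping `time.time()` into the `elapsed` calculation would work almost all of the time, which is exactly the kind of bug that only surfaces when NTP decides to step the clock.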
You can only reason so much about how a system of computers will behave, and every possible way things can go wrong, before you start diving into the limits of quantum physics and the notion of truth itself! It can be overwhelming!

Fortunately, we don’t need to go as far as figuring out the meaning of life. In a distributed system, we can state the assumptions we are making about the behavior (the system model) and design the actual system in such a way that it meets those assumptions. Algorithms can be proved to function correctly within a certain system model. This means that reliable behavior is achievable, even if the underlying system model provides very few guarantees. — Kleppmann, pg. 330

Comforting words, Martin.

The last chapter of the book is a divergence from the rest. It's still technical, but it's a forward-facing look at how data system technology might evolve in the future. Kleppmann shares his personal views on what makes good system design and how to best leverage different tools and paradigms for today's data needs. In particular, Kleppmann makes the argument that maintaining data synchronicity between systems is best achieved through log-based derived data vs. trying to implement distributed transactions:

In the absence of widespread support for a good distributed transaction protocol, I believe that log-based derived data is the most promising approach for integrating different data systems. — Kleppmann, pg. 542

In other words, datastores should broadcast changes in a durable and persistent way for other systems to consume at their own pace. This naturally entails dealing with eventual consistency when reading from derived sources, but Kleppmann believes that's a more manageable problem to solve compared to the performance and complexity concerns of coordinating synchronous data updates. From everything I learned by reading this book, I understand why he believes that.
For example, some "classic" data constraints, such as "two people can't book the same hotel room", might not need to be so inviolable. Fixing those conflicts asynchronously and issuing a notification to the inconvenienced user (plus a coupon or some other small compensation) is an acceptable trade-off your business could consider.

The book ends with Kleppmann examining, from a broader societal perspective, the consequences of these enormous data processing systems that have been built over the last 20 years. The data involved in the majority of these systems is data about us, the users. Privacy and data consent are paramount concerns to wrangle with as these systems get better and more accurate. I've previously written about the current state of consumer privacy in my review of The Age of Surveillance Capitalism, so I won't go into more detail here.

Kleppmann also talks about predictive analytics and the world of data science. These days, a machine learning model is almost always going to be one of the consumers of a data-intensive application. Machine learning models usually provide automated decision making or predictions. Kleppmann ponders how accountability will be assessed in this world. For instance, he shares his thoughts on bias within data:

it seems ridiculous to believe that an algorithm could somehow take biased data as input and produce fair and impartial output from it. Yet this belief often seems to be implied by proponents of data-driven decision making, an attitude that has been satirized as “machine learning is like money laundering for bias”. — Kleppmann, pg. 590

Part of our role as engineers is to have "moral imagination", as Kleppmann puts it, and a desire for the systems we build to improve upon the past. These are all novel issues we are encountering in the information age and they have broad societal implications.
Engineers have a big role to play in helping to improve the technology and algorithms underpinning the software that runs our lives.

flowtwo.io 3 years ago

Stuff I Dug in 2021

Screamin' at home, and at phones, we all hurtin'
Freakin' ya soul, like the pack, we all herb
Pay me to feel with the funk, we all need ya
Mixin' the feel with the facts, we all hurt
— Isaiah Rashad, All Herb

I like stuff. I decided to recognize the stuff that I thought was the best stuff out of all the stuff I liked last year. I did this in 2020 and I did it again this year. This is a list of my favourite music, film, and TV shows from 2021.

I immediately fell in love with this song the first time I heard it. Slowthai is a new artist for me; he uses a blend of hip-hop with more familiar British (his home country) genres like grime and punk. He experiments with all of these sounds on his 2021 album Tyron, of which feel away is the 2nd last song.

feel away sounds like slowthai's approach to a love song. In this case, it's like the love is lost already. The piano refrain is melancholic and ethereal, echoing away throughout the whole song. It evokes the feeling of losing something, like something is slipping away. The bridge by James Blake in the 2nd half of the song is beautiful and haunting. As the beat breaks back in, Blake sings:

I'll leave the dent in my car
To remind me what I could have lost
— James Blake, feel away

Again, it's about losing something. The aftermath of having something important taken from you. It could be a relationship, a friend, a pet, or your favourite coffee shop closing down due to economic hardship. No matter what it is, songs like this have a way of reminding us about these things. Even the outro by Mount Kimbie is really cool and fits perfectly with the song. Just an excellent composition all around.

Shoutout to Cautious Clay, who made this a tough decision with the release of Deadpan Love this year. It's an amazing record and was definitely a close second.

Isaiah Rashad hasn't released an album since 2016. He's barely released any new music in that time. I first heard him via Cilvia Demo, his EP from 2014.
It's an incredible debut. I instantly fell in love with his melodic, laid back style of hip-hop. His cadence is finely tuned and so easy to listen to. Rashad doesn't place much importance on enunciation...but if this is mumble rap then it's the best mumble rap out there.

The House is Burning is so well done. Rashad has perfected his sound on this album. Each track is different, but as a body of work it's got everything. Rashad can carry a song by himself, but he selected some great featured artists for most tracks on the album. A good example is Lil Uzi Vert's verse on From the Garden, one of the album's standout tracks. My favourite track is RIP Young though. It's got a beat that needs to go on a diet and a really catchy chorus. It also showcases Rashad's impressive lyricism. He's not easy to understand sometimes, but his verses are pure poetry.

The Blue Stones are a two-piece rock band hailing from Windsor, Canada. With only a guitar, drums, and two mouths, The Blue Stones manage to make some really dynamic, catchy rock music. They released their sophomore album Hidden Gems in 2021 and it was one of my most played albums last year. It's a follow-up to their first studio album from 2018, Black Holes. I'm giving them my artist of the year because both albums are stellar.

I think it'd be easy to criticize their sound as repetitive. Every song from this new album would've fit perfectly fine on the last one. Even though their sound hasn't evolved significantly, there's something to be said for consistency. I was happy to get 10 more tracks from a band I already loved with the release of Hidden Gems. If they don't do anything different on their next album though...might be time to add a 3rd band member.

I'm not sure what the rules of my annual awards list should be. I technically watched Sound of Metal for the first time in 2020. But I've watched it a bunch of times since then, including this year. And yesterday. Sound of Metal is a superb film.
It's about a recovering heroin addict named Ruben who plays drums in a metal band with his girlfriend. Ruben, played by Riz Ahmed, starts to lose his hearing and has to figure out a way to cope with this burgeoning disability. Director Darius Marder did an amazing job bringing this original screenplay to life. Sound of Metal presents a realistic and eye-opening view of what deafness is like. Ahmed's performance was outstanding; he's become one of my favourite actors in recent years.

MFW I literally go deaf

The sound editing is incredible and adds so much to the experience. In fact, Sound of Metal won the Oscars for Best Editing and Best Sound last year. It was also nominated for Best Picture, which it definitely could've won.

Coded Bias is a documentary about modern technology and the biases that are imbued within it. In particular, it looks at the racial bias present in facial recognition. It also explores how software and algorithms are being used more and more to make decisions about us across all aspects of life.

It was really eye-opening. The stark difference in accuracy found in popular facial recognition services against women and people of colour was astounding. It's so surprising that major tech companies like Google and Amazon would release these services without checking for such obvious biases.

The film was really well done. It explores a range of topics and has interviews with experts who are active in these debates about facial recognition, widespread surveillance, and algorithmic bias. Coded Bias was an illuminating film that I learned a lot from. It definitely made me think about how much control we are ceding to these algorithms and the people who develop them.

I've loved every movie by Denis Villeneuve I've seen. Arrival, Sicario, and Blade Runner 2049 are all fantastic...great stories with great cinematography. His latest movie, Dune, was released back in October.
It's the latest attempt to make a film adaptation of a book that's been notoriously hard to adapt. I think Villeneuve did a decent job, all things considered. Dune was a visually stunning movie, especially in IMAX. Hans Zimmer also awed audiences with his goose use

Netflix has produced some pretty fantastic shows and movies the past few years. The Serpent was a co-production between Netflix and the BBC; it's an 8-part limited series released last April. It's based on the actual story of serial killer Charles Sobhraj, who drugged and killed tourists in Thailand during the 1970s. It's suspenseful, engaging, and super creepy. Extra creepy when you remember that it actually happened. It's a must-watch for anyone that's not planning a trip to Southeast Asia anytime soon.

I'm still not done with The Sopranos, but I'd be kidding myself to say this wasn't the best TV I've watched all year. I'm a little late to the party with this one (the final episode aired in June 2007) but it was better late than never. Despite the show's age, the themes and story-lines hold up surprisingly well today. It's a classic mob drama told with a modern lens, aware of the Godfathers and Goodfellas that came before it. Mixing mafia crime and family drama, The Sopranos is a show that finds deadpan humour embedded in its realism. But the show is propelled by the excellent casting and the performances of every lead character. James Gandolfini is Tony Soprano—he commands the role of a mafia boss succumbing to the pressures of his responsibilities. It's really entertaining TV.

High Maintenance is a really unique show. It's kinda an anthology—each episode explores different lives of people living in New York City. It's a very modern, very progressive look at life today. The only thing that loosely ties the stories together is The Guy...the nameless protagonist who bikes around the city delivering weed to the people in the show.
Some episodes feature him more than others, but in general his life is not really the focus of the show. I loved the final episode of season 3, Cruise. It wasn't especially better than any of the other episodes, but it had a bit of everything. My favourite part is the last 10 minutes of the episode, which felt like an homage to bicycling in the city. It ends with The Guy biking home at night, overlaid with a monologue from the famous poet and NYC tour guide, Speed Levitch. Levitch is also featured in a few different scenes in the episode.

As an avid city biker myself, I appreciated the tribute. Biking through a busy downtown is an immersive mix of chaos and order. All your senses are saturated by the buzz of the city as you cruise through it.

I suppose if I had an essential goal on the cruise right now, it would be to exhibit the fact that I'm thrilled to be alive and to still be respected. I suppose the soulful or the Buddhist out there might ask, 'Why do you need respect from others? The thrill to be alive, that's your own business. You can do that in your living room.' But that's not what the cruise is for me. The cruise is about the searching for everything worthwhile in existence. I mean, I will appreciate the beauty of a flower, and then likewise, I will stand exhibitionistic and have the flower appreciate the beauty of me. Well, that's how I feel about cruising right now. And I would say having a quote, unquote, 'intimate love affair' with a flower is far more psychotic and riveting than having an 'intimate love affair', quote, unquote, with some of the banal creatures of the human race. Although I'd be into that too. — Speed Levitch, The Cruise

Cruising through 2021 was a ride in itself. But the cruise is about searching for everything worthwhile in existence. Let's keep searching.

flowtwo.io 4 years ago

All the Wrong Moves

Every year, I discover more and more, that I'm the same as everyone else. Which is kind of great, because it means that life is not so mysterious. You just do what other people do. Say please. Floss. When you're making scrambled eggs, stir them really fast so they don't get crusty. Find a few good people and try to hang on to them. Don't lose all your pieces. — Sasha Chapin, All the Wrong Moves, pg. 70

You know how some people hate movie trailers? Like, they'd rather watch movies without seeing the trailer first because it usually spoils a bunch of the plot. Well, I'm the same way with books. I like starting a book without knowing too much about it, pretty much for the same reason as those trailer-haters. I enjoy figuring out the story as I read it so I can make my own judgments. I find it more engaging, especially with fiction novels.

The "to-read" list on my phone is compiled from multiple sources (the internet, friends' recommendations, Oprah, etc.) and it's getting pretty long. As a result, the time between adding a book and actually reading it is usually enough to forget why I added it in the first place. I just trust that past Joe added it for a good reason. Usually works out for future Joe (who, of course, becomes present Joe at the time of reading).

That's exactly what happened with All The Wrong Moves by Sasha Chapin. I forgot what I'd heard about it. I thought it was some dramatic tale about a chess player that went crazy like 100 years ago. It wasn't that at all. It was much better and far funnier than I expected.

All The Wrong Moves is all about chess, kinda. Sasha Chapin is obsessed, but also repulsed, by the main subject of his memoir. And just like all good toxic relationships, he ends up with some wild stories from his time spent with the game. I thoroughly enjoyed the book. Chapin's sharp wit shines through on each page. Most of the humour stems from Chapin's acute perspective on life and how it's often stranger than fiction.
When you view things through the right lens. In terms of subject matter, Chapin toes a fine line between funny, self-deprecating cynicism and profound observations about the human experience. Like describing his personal preference between an aggressive or defensive play style:

I was like a child who couldn't draw a house with crayons deciding whether to be more like Jackson Pollock or Francis Bacon. — Chapin, pg. 63

And then, in the same breath (or whatever the written version of breathing is), Chapin will expound on the mystery of determinism:

But it's so hard to tell, from the inside of a life, whether we can control our fate, or whether consciousness is merely the ability to observe ourselves obeying our irrevocable course, as if we were all self-aware pinballs — Chapin, pg. 100

This self-aware pinball found the writing absolutely hilarious. Chapin embeds humour into every subject in the book. At a rapid-fire pace too—I would audibly laugh several times in between page turns. The timing and rhythm reminded me of stand-up comedy.

I love how Chapin portrays chess as a character in his memoir. It was such an integral part of his life for so long that it felt like a person. But it wasn't his friend; it was the antagonist in All the Wrong Moves. Chess lures him in with its abstract beauty and illustrious history. Chess also feels like a world separate from our own—it occupies a higher plane within our minds. Chess can feel like an escape from the viscerality of life.

Yet the world of chess is its own special form of hell for Chapin. He becomes consumed by his drive to conquer the game, to understand its inner workings and secret rhythms better than his opponents. This obsession takes him around the world; he sacrifices relationships, sleep, and his own health. Specifically, he wants to beat someone with an Elo rating above 2000 at a tournament in Los Angeles. Mostly because it's a nice round number. All The Wrong Moves is all about chess, but it's also not.
Chess could be substituted by almost anything in this story, because Chapin isn't writing about it, he's writing about his relationship with it. I learned a lot from reading this book, or at least it expanded my views on many important concepts in life (funny how a book can do that). Obsession, conformance, following your passions, and dealing with the gradual realization that you aren't that special. At least from any reasonably zoomed-out perspective. We all know your mom thinks you're special.

Chapin fully admits, from the start, that his obsession with chess was unhealthy. It was absolutely not good for him and his well-being. From sleepless nights playing online chess with strangers, to the anxiety and stress he dealt with during tournaments, Chapin wasn't in control of this hobby. Chess was in control. What I found interesting was how self-aware Chapin was. Trying to resist the urge to play was futile, and he accepted that. He explains how chess entered his life in high school, when he joined the chess club. Despite some brief breaks from it, Chapin was consumed by the game for most of his 20s. Was this unrelenting pursuit of chess mastery Chapin's choice? It doesn't sound like it.

Frankly, I didn't feel like I was doing much until chess came along. [...] it felt like a possession—like a spirit had slipped a long finger up through my spine, making me a marionette, pausing only briefly to ask, "you weren't doing anything with this, were you?" — Chapin, pg. 4

This fact, that Chapin never really had a choice about devoting himself to this game—it feels like the central point he was trying to address in this memoir. I really appreciate how Chapin leaned into his obsession. He fueled his passion for chess for years until he could feel satisfied. Maybe not satisfied with the outcome, but satisfied with the effort he put in. So much of our lives are determined by what we're exposed to—the ebbs and flows of life around us.
These are the tides that can push us out to sea. The question is whether you choose to sink, or learn to swim. Ultimately, we hold on to the belief that we control what we want to pursue in life. What we want to give ourselves to and become passionate about; the mountains we choose to climb. But maybe which mountain we choose isn't that important. It's about deciding to climb.

Nature analogies aside—chess playing could be seen as one of the least useful skills to devote time to. It's just a game after all. But it's a great example of how applying yourself will change you, no matter what the application is. Chapin believes that no matter how it works out, you'll be a better person at the end of the day.

Life often contains the discovery that your place in humanity isn't quite what you thought it was. You find out that you weren't meant to be the lover of the thing you first loved. But it's not so bad. If you're lucky, you end up loving something else. When failure removes you from the wrong path, as wrenching as that feels, you ought to be grateful. You're a little closer to where you should be, even if you don't know where that is yet. — Chapin, pg. 121

flowtwo.io 5 years ago

Range

After reading a few of these pop-social-science books, they all start to sound the same. It's a repetitive formula that becomes a chore to read through. Anecdote, explanation, anecdote, explanation, anecdote, explanation...It's just a bunch of stories that the author uses to enforce a vague, sweeping generalization about how the world works. Maybe with some "studies" thrown in to add some scientific credibility to the theory as well. Range by David Epstein falls directly into this category. I listened to Epstein's previous book, The Sports Gene , as an audiobook and I realize that it's the best way to consume books like this: passively. I find them tiring to read because you're constantly throwing away all the details of the story you just read when the author moves on to the next one. I should've known when I saw Malcolm Gladwell had a blurb on the book's front cover ("I Loved Range!"). Gladwell is the king of this style of writing. I don't have anything against this writing style per se , but I don't enjoy reading it anymore. It's hard to denounce authors like Gladwell and Epstein for trying to explain a sociological phenomenon in an accessible way for a general audience; it's easy reading and you can learn a few things too. I just think they're forgettable, boring, and too long. The thesis of this entire book could've been explained in a blog post with basically the same impact, if not more. If you could find some blog on the internet talking about Range instead, you could get a basic understanding of what it's about and end up with the same "knowledge" of how the world works as if you'd read all 339 pages! ...I guess I could be that blog. Mathematician Freeman Dyson^ 1 ^ explains what "range" means below: Birds fly high in the air and survey broad vistas of mathematics out to the far horizon," Dyson wrote in 2009. "They delight in concepts that unify our thinking and bring together diverse problems from different parts of the landscape. 
Frogs live in the mud below and see only the flowers that grow nearby. They delight in the details of particular objects, and they solve problems one at a time." As a mathematician, Dyson labeled himself a frog, but contended, "It is stupid to claim that birds are better than frogs because they see farther, or that frogs are better than birds because they see deeper." The world, he wrote, is both broad and deep. "We need birds and frogs working together to explore it." — David Epstein, Range, pg. 161 Dyson was concerned that modern science was overflowing with frogs, like an invasive species. Frogs have strong expertise in one specialized area and they place all their focus in this area. Birds have a more generalized knowledge of many different areas, and they understand interconnections and overall systems better. David Epstein wrote Range to explain the benefits of being a bird, and how to become one. Epstein uses real-life examples to showcase the power of range and the pitfalls of over-specialization. From Roger Federer's late start in tennis to Van Gogh's lifetime of sucking at art, there are innumerable examples of people who have achieved great success in their fields despite having diverse backgrounds. Then there's Tiger Woods, who was hitting golf balls when he was two years old. Sometimes hyper-specialization works too. Like I mentioned above, I feel like the book was probably twice as long as it needed to be. Epstein expands on this idea of "diversity = good" over many chapters, each with a slightly different lesson. One chapter is about the origins of Nintendo and how they relied on creativity instead of worrying about having the best technology. Another is focused on the "outsider advantage"—examples of people who have made important breakthroughs in a field without having expertise in it.
Chapter 5 talks about German astronomer Johannes Kepler, who used analogies from different areas of life to try to explain the planets' movements: Kepler's short Mars assignment (he guessed it would take eight days) turned into five years of calculations trying to describe where Mars appeared in the sky at any given moment. — Epstein, pg. 95 Sounds like the work estimates in my sprint planning meetings. All these stories are definitely, in some way, related to Epstein's range concept. But there was a lot of pigeonholing going on. It seems like Epstein starts with an interesting anecdote and then figures out how a lesson about range or diversity can be extrapolated from it. This sort of post-hoc rationalization is a common critique of other social-science authors like Gladwell. The "glue" work that Epstein did to patch all these disparate stories together so that he could present a consistent narrative felt forced. But on the bright side, the stories themselves were cool and I learned some interesting things. So it's not all bad. Throughout Range, Epstein talks about how society is geared towards turning people into frogs, not birds. In other words, in life it's much easier to become highly specialized at something than to develop wide-ranging knowledge and experience: there is often no entrenched interest fighting on the side of range, or of knowledge that must be slowly acquired. All forces align to incentivize a head start and early, narrow specialization, even if that is a poor long-term strategy. That is a problem, because another kind of knowledge, perhaps the most important of all, is necessarily slowly acquired—the kind that helps you match yourself to the right challenge in the first place. — Epstein, pg. 88 These incentives towards specialization start in our early childhood, as Epstein explains in Chapter 1.
Parents are drawn towards stories of child prodigies like Tiger Woods, so they believe that if their child is to be successful, they must start practising as early as possible. Epstein calls this the "cult of the head start". Epstein also spends a lot of time critiquing the education system, like how teachers teach and how we measure aptitude. It's a problem of misaligned incentives; we want to see students succeed and get questions right on tests, but we need it to happen quickly. It leads to non-durable learning techniques: Rather than letting students grapple with some confusion, teachers often responded to their solicitations with hint-giving that morphed a making-connections problem into a using-procedures one. — Epstein, pg. 67 "Using procedures" means memorization, and "making connections" refers to a deeper level of understanding of the concepts that produce the right answer. It's harder, and it takes longer, to make connections and develop that sort of understanding. I agree with Epstein that the school system doesn't provide most students with long-lasting knowledge. But mass education is an incredibly complex issue, so it's easy to critique its flaws but hard to provide any feasible solutions. The push towards specialization continues as we approach graduation and must then choose whether we want to pursue post-secondary education. When we go to college, we have to pick what we want to study and build a career out of for the next 40 years. Making that sort of decision when we're 17 is kinda crazy when you think about it. Of course, by the time you're 10 or 15 years removed from your undergraduate studies, what you studied is often irrelevant to your current work. I don't have any numbers to back that up, so just take it as anecdotal evidence. Or this might be a case of the Igon Value Problem... The point I'm trying to reiterate from Range is that life will be easier the more you specialize and stick to what you already know.
What the specialist misses out on is the increased perception and understanding of the world that's gained through diverse experience. You'll also learn more about yourself by diversifying, something that's hard to quantify: Ibarra marshaled social psychology to argue persuasively that we are each made up of numerous possibilities. As she put it, "We discover the possibilities by doing, by trying new activities, building new networks, finding new role models." We learn who we are in practice, not in theory. — Epstein, pg. 130 It's just like going on vacation with a significant other for the first time. Being far away from your regular environment and routines with each other will be an invaluable test for the relationship. It could potentially make or break it—but it's better to know, isn't it? In conclusion, I didn't hate Range by David Epstein. But it was too long and tiring to read. I'd recommend the audiobook version, or just finding a good blog post online explaining the main concept. Remember: If you're a frog, try to climb a tree once in a while to check out the whole swamp. And if you're a bird, don't be afraid to swoop down and play in the mud sometimes. ^1^ No, this is not the guy who owns the vacuum company, but he has come up with some other wild physics concepts like the Dyson Sphere.

flowtwo.io 5 years ago

Stuff I Dug in 2020

I like stuff. Like books, clearly, but also other stuff. Here's a list of my favourite music, film, and TV from this past year. This also happened to be the number 1 song on my Spotify Wrapped this year. The Districts are one of my favourite bands to begin with, and they released their 4th studio album You Know I'm Not Going Anywhere in March of this year. The lead single, Hey Jo, is an energetic, wild ride of a song that instantly got me pumped for the album. The Districts have built on their raw, distorted sound and added crisp, intriguing production to their music, which is showcased on this 4th album. Hey Jo is a perfect example of that, moving seamlessly from subdued, seething lyricism in the opening verse into crushing drums and an anthemic chorus led by lead singer Rob Grote's infectious howling. The album was a great companion to rock out to during a pandemic, even though the pandemic also cancelled their tour and the show I had tickets to see. Hoping to see them in 2021 so I can finally hear this album live. I first discovered this artist when I heard Gospel for a New Century on Hype Machine. After that, I listened to the entire album. Then I listened to it a few more times. Every time I listened, I gained more appreciation for it as a body of work. Heaven To A Tortured Mind could be categorized as post-industrial neo-psychedelic hypnagogic pop music. But I don't know what any of those things mean, so I'm not gonna do that. What I do know is that Yves Tumor made some amazing music on this album. It's definitely experimental—each song features diverse instrumentals and unconventional composition. But it's also very catchy, pop-inspired music at its core. The album opens with its powerful lead single Gospel for a New Century: blaring horns and a crooning chorus that set the stage for the rest of the record.
Over the next few songs Tumor embodies the role of a tortured rockstar, exploring love and loss in an emotional soundscape, culminating in a two-part crescendo of Romanticist and Dream Palette—my favourite songs on the album. Dream Palette consists of Tumor singing alongside featured vocalist Julia Cumming, the two frantically searching for each other among drums, fireworks, and a discordant piano. They finally harmonize by the chorus and, together, pose an essential question to themselves, and to the listener: Floating through, what feels like A declaration of love Our hearts are in danger Tell me, is this fundamental love? A feeling, you deserve A stare you, you've seen before Our hearts are in danger Tell me, is this confidential love? I don't know what it means, but it sounds really fucking cool. This kid has got the Midas touch. Every song he touches turns to gold. He's got a great voice, and his rapping and lyricism are top-notch too. His sound sits somewhere in a space between the sing-rapping of Drake and the alternative R&B vibe of Miguel or Khalid. And he deserves to be as big as any of those artists within a few years. Excited for more Col3trane in 2021, which will hopefully include a debut album. I don't have much to say about 1917 because it's simply an exhilarating cinematic experience. The movie only gets better with the quality of your audio-visual setup. I was fortunate to watch it in theatres, in IMAX, and it was incredible. If you watch it on your phone or something, it's not going to be the same experience. The story is forgettable—it's a World War I movie about two soldiers delivering an important message across enemy lines. Its impact lies in the single-shot format utilized throughout the entire film. The camera is always focused on the two protagonists. This provides an intimate view into their world and it completely immersed me in the movie.
It's like the WWI version of Sam and Frodo. I'm sure it took an incredible amount of directing and technical skill to produce a movie like 1917—I've never seen another film like it. The cinematography was top-notch and I'm a sucker for good cinematography. No No: A Dockumentary is about the life of Dock Ellis, an MLB pitcher from the 1970s, who is known for pitching a no-hitter while high on LSD. I went in assuming this would be an off-beat documentary centred around Dock's infamous game, but it turned out to be much more. The documentary provided a surprising examination of the cultural climate in the 1960s and '70s: the politics, the racism, the hippie movement, and baseball. They did a great job capturing the world that Dock Ellis lived through during his baseball career. Dock's life turned out to be fascinating. He was a larger-than-life personality and his life took many twists and turns. He was able to turn around a hard dependence on drugs and alcohol and become a counsellor in his life after baseball, helping adolescents and the incarcerated with their own substance use disorders. He was able to reach hundreds of kids with his words and his story, changing their lives and improving his community, before his death in 2008. No No: A Dockumentary was a great documentary about a really interesting person. It was a vibrant and playful film accompanied by a '70s-inspired soundtrack—a fitting portrait for a man who never quite wanted to fit in. After watching Parasite, I needed more Bong Joon Ho. During quarantine back in the spring, when my roommate and I were both spending every day at home, we decided to rip through Bong's entire filmography. I love his original stories and willingness to approach difficult or strange topics that don't often get screen time. He's also great at building tension and suspense in his movies; they've all been wildly entertaining so far. My favourite, besides Parasite, was Mother. It was crazy.
Chernobyl is an amazing 5-part HBO series which came out in 2019. It won the Emmy for Outstanding Limited Series last year, so I'm clearly not alone in my praise for this show. I loved its portrayal of the tension between scientific truth, bureaucracy, and politics. The nuance and realism were captured so well. The final episode is the highlight of the series, elevated by Legasov's impassioned speech at his trial. It was a remarkable example of the simultaneous power, and danger, that truth holds. We get it, you vape, bro. As we all know, 2020 was an absolute dumpster fire of a year. Even though watching World War I movies and shows about nuclear disasters is fun, sometimes you just want to watch something a little more lighthearted. Ted Lasso started out as a series of commercials for NBC to promote their coverage of the English Premier League a few years ago, with Ted Lasso played by Jason Sudeikis. It was picked up last year by Apple TV and made into an entire show with basically the same premise (just with a lot more Apple products in it now). Despite the show's bastardized origins as a commercial piece of garbage, it's actually really funny. It's easy to watch and it's cheerful. Sudeikis does a great job of playing the most warmhearted and genuine character I've ever seen on TV. I highly recommend watching it for anyone dealing with sadness, stress, ill health, poverty, hemorrhoids, autocratic dictatorships, religious persecution, or existential crises stemming from the threat of super-intelligent AI robots taking over the world. Or just if you're bored. This was an absolutely jaw-dropping 56 minutes of pure emotion. Mr. Robot has been my favourite show over the last 5 years and this final season was a treat to watch. Director Sam Esmail has focused on cinematic excellence throughout the series and he hasn't been afraid to experiment with interesting ways of telling Elliot's story.
From single-shot episodes to '80s sitcom parodies, Esmail's creativity is why I fell in love with this show. This episode was aired in its entirety with no commercial breaks, something I'm sure Esmail had to pay for, reflecting his commitment to artistic integrity in his work, and to Mr. Robot in particular. 407 Proxy Authentication Required is delivered like a Shakespearean play. It actually has title cards breaking it up into 5 acts. The theatrical tone is amplified by the original musical score (produced just for this episode) and the cinematic aspect ratio. From the acting, to the cinematography, to the story, it's a flawlessly executed episode of television. The episode also contains probably the biggest story reveal of the entire show, revolving around a crucial detail of Elliot's life. It's a tragic and powerful moment that was delivered with an incredible performance by Rami Malek. Watching 407 Proxy Authentication Required was incredible and made watching the entire Mr. Robot series worthwhile, even though it already was.

flowtwo.io 5 years ago

The Mezzanine

I was riding in a car with a couple of friends recently. One of them was talking about a book she'd just finished called American Psycho and how much she hated it. I've heard of it, as well as the movie, but to be honest I've never read or watched either. She described how it was a waste of time, and boring, because the narrator would spend pages upon pages describing every minutia and going into obscene amounts of detail about things ("He talked about ONE suit for five pages straight!"). The narrator is also a serial killer, apparently. What my friend was describing sounded incredibly similar to the book I was reading at the time: The Mezzanine. I told her it could be worse, my protagonist doesn't even have any interesting traits like being a psychopathic serial killer... he's just an office worker. Instead of suits, it takes multiple pages to describe the history of his earplug preferences: I used Flents Silaflex silicone earplugs. Only since 1982 or so have these superb plugs been generally available, at least in the stores I visit. Before that I used the old Flents stopples, in the orange box—they were made of cotton impregnated with wax, and they were huge: you had to cut them in half with a pair of scissors to get a shape that would stay put when you worked it in place, and they left your fingers greasy with pink paraffin. They revolted L., who used to store any I left on her bedside windowsill in an empty pastilles canister with a rural scene on it—and I don't blame her. Then a company called McKeon Products began to be a force in the market, offering Mack's Pillow Soft® earplugs[...] — Nicholson Baker, The Mezzanine, pg. 15 It keeps going but you get the point. The Mezzanine is one of the most unique books I've ever read. It reminded me of David Foster Wallace's writing a bit, and not just because of the extensive use of footnotes. It's kinda like if you paused the plot of a David Foster Wallace novel for 135 pages, you'd get The Mezzanine.
There was no plot in this book, and you could argue that the entirety of it takes place within a single escalator ride. The book begins with our narrator starting his escalator ride, and it ends with him stepping off that same escalator—everything else is tangential thinking emanating from his consciousness in these moments. This stream of consciousness is a wild ride to be on. The narrator, whose name is Howie, is an extremely neurotic, introspective young man grappling with the tedium of a standard white-collar career and the gradual realization that he isn't very special and won't amount to anything great in life. The feeling that you are stupider than you were is what finally interests you in the really complex subjects of life: in change, in experience, in the ways other people have adjusted to disappointment and narrowed ability. You realize that you are no prodigy, your shoulders relax, and you begin to look around you, seeing local color unrivaled by blue glows of algebra and abstraction. — Baker, pg. 15 Howie's penchant for nostalgia and reflections on his young life are recurring subjects in his thoughts. He reminisces about these early years, and I got the sense that he misses them and the way things used to be. But, at the same time, Howie also seems wholly devoted to his life of adulthood and its routine nature. Howie has an interest in mechanical systems and the ingenuity that has gone into their designs.
From escalators, to staplers, to milk cartons, Howie describes the clever aspects of their design with a sense of genuine appreciation and excitement: the radiant idea that you tore apart one of the triangular eaves of the carton, pushing its wing flaps back, using the stiffness of its own glued seam against itself, forcing the seal inside out, without ever having to touch it, into a diamond-shaped opening which became an ideal pourer, a better pourer than a circular bottle opening or a pitcher's mouth because you could create a very fine stream of milk very simply, letting it bend over that leading corner, something I appreciated as I was perfecting my ability to pour my own glass of milk or make my own bowl of cereal—the radiant idea filled me with jealousy and satisfaction. — Baker, pg. 28 Howie likes to see his own life as a system as well—a conveyor belt of recurring days filled with work, lunch hours, driving, and everything in between. He takes pride in being the operator of this personal existence factory; a master of his domain, always looking for ways to cut costs and streamline processes. Like when he figured out how to apply deodorant while fully clothed, Howie appreciates the little things in life. I think that's great. This book felt like the 1980s version of this video: From the novel's synopsis and the blurbs included on the book's jacket, it's apparent that The Mezzanine is supposed to be funny. I mean, in a way it is funny—but it seemed like the joke was on me. Reading about shoelace abrasion theories for multiple pages in a row felt like I was being pranked somehow. If I cast aside my pessimism over the pointlessness of the novel, I admit that it was really well written. Its humour stems from the dazzlingly sharp focus that Baker applies to the banality of modern life. His descriptions of the ordinary—putting into words the small things we all recognize but never articulate—are clever and refreshing.
It's similar to the observational humour used by comedians like Seinfeld; they re-frame our perception of certain aspects of life by describing them in a unique way. The best description I've found for what the fuck is going on in this book is, obviously, in its Wikipedia article: The substance of the novel, however, is taken up with the thoughts that run through a person's mind in any given few moments, and the ideas that might result if he or she were given the time to think these thoughts through to their conclusions. — Wikipedia, "The Mezzanine" (link) This makes sense to me and it really explains what Baker was trying to accomplish with The Mezzanine. It's an exploration of the infinitude of thoughts and half-formed ideas that float through our consciousness throughout the day. In a way, it's a celebration of the human mind and its marvellous complexity. I enjoyed reading The Mezzanine, but I'm not sure I would recommend it to anyone. I felt like I was getting pranked, and I chose to read it of my own volition—I'm not trying to lose any friends because of a book recommendation. But maybe if you're looking for something of a different flavour, and you've grown sick of all the plot and character development in those ordinaryyyyy books, then consider giving The Mezzanine a shot. It's only 135 pages. I guess part of why I'm not too upset about it is because I read the entire thing while sitting on the toilet. So it wasn't time wasted, just time spent.

flowtwo.io 5 years ago

Norwegian Wood

Long after the firefly had disappeared, the trail of its light remained inside me, its pale, faint glow hovering on and on in the thick darkness behind my eyelids like a lost soul. More than once I tried stretching my hand out in that darkness. My fingers touched nothing. The faint glow remained, just beyond their grasp. — Haruki Murakami, Norwegian Wood, pg. 87 There's something really entrancing about reading Murakami. It's easy to read, but not because it's simple writing. His use of pace and language is rhythmically consistent, creating a dreamy atmosphere that lulls you in. The writing is also beautiful, at times rising to incandescent lyricism. Even the descriptions of the mundane activities of Murakami's characters aren't boring to me. I think it's because his stories take place in Japan, and the routine aspects of life in Japan are novel and interesting for Western readers. The things they make for dinner, what they do for fun, the sights and sounds of Tokyo... these elements of Japanese life are an additional aspect that makes Murakami's writing enjoyable. I've only read 1Q84 and Norwegian Wood so far, but I look forward to reading more of Murakami's work in the future. Norwegian Wood is similar to 1Q84, but it's more grounded in reality. Both stories revolve around lovers who cannot be together. Both are set in Tokyo, and both are marked by themes of loneliness and isolation; apparently this is common in all of Murakami's work. I liked Norwegian Wood more, probably because it was a third of the length. Norwegian Wood tells a story about love and death, and how our understanding of these things changes as we grow older. Our protagonist is Toru Watanabe, a young man living in Tokyo, originally from Kobe, who is caught in the midst of a love triangle^1^. Naoko is one vertex of this triangle; she and Toru are bound by mutual tragedy. Her boyfriend, who is also Toru's best friend, commits suicide when they are 17.
Toru and Naoko both move to Tokyo for college and after a short time together there, Naoko leaves to enter a sanatorium in order to recover her mental health. Toru is in love with Naoko and despite her leaving, he still feels obligated to her: And besides, I still loved Naoko. Bent and twisted as that love might be, I did love her. Somewhere inside me, there was still preserved a broad, open space, untouched, for Naoko and no one else. — Murakami, pg. 485 Toru's commitment to Naoko prevents him from loving anyone else, which makes his relationship with Midori complicated. Midori is a lovable, passionate girl who Toru would like to be with, but he feels it would be unfaithful to Naoko. In metaphorical terms, I'd say that Naoko represents death, and Midori represents life. Toru is afraid of death, and this is preventing him from living freely. I like how Murakami presents the struggle of love. Loving someone else is not always a gift, it can be a burden as well. But the struggle is what makes love genuine and worthwhile. The most poignant thoughts on love come near the end of the book, when Toru receives a letter from Reiko, Naoko's roommate at the sanatorium. It's in response to Toru's letter seeking advice on his torn feelings towards Naoko and Midori. Murakami shares some beautiful and gentle advice on love, and life, that goes beyond just Toru's situation: Things like that happen all the time in this great big world of ours. It's like taking a boat out on a beautiful lake on a beautiful day and thinking both the sky and the lake are beautiful. So stop eating yourself up alive. Things will go where they're supposed to go if you just let them take their natural course. Despite your best efforts, people are going to be hurt when it's time for them to be hurt. Life is like that. — Murakami, pg. 487 I read Reiko's letter several times before moving on to the next chapter—the first sentence of which hits you like a ton of bricks.
Immediately after reading this letter, Toru, and the reader, find out that Naoko committed suicide. This letter was meant to be an exoneration for Toru from his self-imposed responsibility to Naoko. Reiko was giving Toru license to live, and love, in any way that made him happy. I believe Toru would have followed through on this advice, but only on the assumption that Naoko was still alive and getting better. Her death pulls him right back in and launches him into despair. It doesn't sound like he ever recovers. The book ends with Toru asking himself a question: Where was he? At last, Midori's quiet voice broke the silence: "Where are you now?" Where was I now? Gripping the receiver, I raised my head and turned to see what lay beyond the telephone booth. Where was I now? I had no idea. No idea at all. Where was this place? All that flashed into my eyes were the countless shapes of people walking by to nowhere. Again and again, I called out for Midori from the dead center of this place that was no place. — Murakami, pg. 531 The broader question implied here, as Toru is phoning Midori to get back together with her, is whether they do in fact stay together. Murakami leaves this open to the reader, partly because a clear-cut happy ending would be inappropriate for the story. But I believe there's enough context to suggest Toru doesn't stay with Midori. Toru ends up following a life path similar to that of Nagasawa, his friend from college. Toru will become career-obsessed and go off to travel the world, possibly as a writer or journalist, leaving Midori and happiness behind. Nagasawa sees the similarity between himself and Toru, and he remarks on it during dinner with Hatsumi earlier in the story: "[...]Toru's practically the same as me. He may be a nice guy, but deep down in his heart he's incapable of loving anybody. There's always some part of him somewhere that's wide awake and detached. He just has that hunger that won't go away.
Believe me, I know what I'm talking about." — Murakami, pg. 384 This inability to love somebody is a commonality between the two of them, but their motivations differ beneath the surface. Despite Nagasawa's success and good fortunes, he's cursed by an incurable loneliness—his own "special hell" as Toru describes it. Nagasawa cannot love, or even empathize with, anyone and tries to hide this void through a life of hedonism and promiscuity. Toru, on the other hand, cannot love because his love has already been spent. He loves Naoko, and is bound to her by tragedy and loss. He wants to be there for her forever, unlike Kizuki: And all because you killed yourself and left Naoko behind. But that's something I will never do. I will never, ever turn my back on her. First of all, because I love her, and because I'm stronger than she is. And I'm just going to keep on getting stronger. — Murakami, pg. 448 During the course of the story, Toru is determined to become the person who saves Naoko. After Naoko's death, this determination becomes Toru's tragic fate—her death does not free him as you might expect. Naoko remains with Toru in his heart. Unreachable, but always with him. In the end, Toru will choose Nagasawa's way of life, a life of isolation, over Midori and a life of passion. Twenty years later, on a plane to Germany (not for the first time), the Toru we meet is still haunted by those dreams of Naoko. Furthermore, we know that Germany was at least one destination for Nagasawa in his career as a foreign diplomat^2^. I believe this is a hint from Murakami alluding to the similar fates of Toru and Nagasawa. The idea of flying away to various parts of the globe is much like running away, or at least always running. Nagasawa will never stop in his quest to control life. To grab life and laugh in its face. I think Toru follows him into this life of nothingness—a place that is no place.
The irony of this decision is made clear by Toru's statement at the end of the first chapter. After years of reflection and thinking about his love for Naoko, he arrives at a sad conclusion: Naoko didn't love him back. After finishing this book, I highly recommend re-reading the first chapter. It's more emotional once you know about Toru's relationship with Naoko. The last line of the chapter was particularly illuminating: The more the memories of Naoko inside me fade, the more deeply I am able to understand her. I know, too, why she asked me not to forget her. Naoko herself knew, of course. She knew that my memories of her would fade. Which is precisely why she begged me never to forget her, to remember that she had existed. The thought fills me with an almost unbearable sorrow. Because Naoko never loved me. — Murakami, pg. 16 This is quite a sad realization for Toru to reach, especially if you subscribe to my theory that he was irreparably affected by Naoko's death in such a way that he couldn't love again. As a young man, he completely devotes himself to Naoko and is deeply in love with her. How did he come to the realization that Naoko never loved him back? It may be an emotional reaction; he devoted his life to her while she responded by killing herself. Toru may feel guilty and upset that he wasn't "enough" to keep her from ending her life, so he concludes that Naoko simply must not have loved him at all. This is easier for Toru to digest than the more complicated answer: Naoko did love him, but she was just too depressed and traumatized to continue living. It's also possible that Naoko actually didn't love Toru. For one thing, she never says or does anything particularly affectionate towards Toru. Between the letters to each other and their conversations when Toru visits her at the sanatorium, Naoko doesn't appear to want to be with Toru very much, now or in the future.
She actively tries to convince him that staying with her is foolish: And that's why I want you to go on ahead of me if you can. Don't wait for me. Sleep with other girls if you want to. Don't let thoughts of me hold you back. Just do what you want to do. Otherwise, I might end up taking you with me, and that is the one thing I don't want to do. I don't want to interfere with your life. I don't want to interfere with anybody's life. Like I said before, I want you to come to see me every once in a while, and always remember me. That's all I want. — Murakami, pg. 270 I don't think Naoko even had the capability to love someone else. She had been together with Kizuki since they were kids, and his suicide, on top of her sister's suicide, was too much for her. She was consumed by sadness, often hearing Kizuki and her sister calling out to her from the shadows, wanting her to join them. She was torn by this, and it left her without the capacity to love anyone else. Naoko may also have resented Toru for taking her virginity on her 20th birthday. Reiko relays to Toru after Naoko's death that their sex had deeply troubled Naoko, and that she never wanted to have sex again: I just don't want anybody going inside me again. I just don't want to be violated like that again—by anybody.' — Murakami, pg. 541 Did Toru "violate" Naoko on her birthday? I re-read the scene of that night to see if their sex was consensual; besides a few mentions of physical reciprocation from Naoko, it doesn't sound like she necessarily wanted it. She was in an extremely fragile emotional state after drinking a lot of wine and crying heavily. Maybe Toru took advantage of her in that situation. It's hard to say for sure whether Naoko ever felt love for Toru. It's clear she appreciated Toru's friendship and cared for his well-being—but she never reciprocated his love for her. Naoko was consumed by the loss she'd experienced and was not able to reconcile her feelings, although she tried.
I think Toru realized this eventually. Norwegian Wood is about death and the lives of those around death. Some characters, like Naoko, are broken by their tragedy, but others have found ways of living with death. Midori loses both her parents, but she continues living with passion and vitality. Reiko had her family taken from her and her life ruined, but she found a new life in the mountains. Nagasawa...well, Nagasawa just seems to be a nihilist, but at least he has a lot of fun in life! Hopefully I'm wrong and Toru wasn't permanently broken by Naoko's suicide. Hopefully he learned to live with death and love again. ^1^ I've never understood why they are called love triangles , since normally they involve three heterosexual people, two of the same gender...it seems to me like more of a love V . If we understand the edges of the triangle to represent romantic desire, then there are really only two possible pairings between the three members of a classic heteronormative triangle. Maybe if one of the participants is bisexual and also happens to be romantically interested in the other same-gendered participant, thereby creating the necessary enclosing third edge, then we could geometrically refer to that situation as a triangle. But even in this scenario, calling it a triangle implies a sort of bidirectionality that isn't really present; the affection represented by the three lines isn't mutual in all cases. I think the only interpersonal love dynamic I'd truly be comfortable calling a triangle would be between three homosexual individuals, each of whom is romantically interested in the other two. Now that would be a love triangle I'd read a book about! ^2^ From page 213: Two years after Nagasawa left for Germany , she married, and two years after that she slashed her wrists with a razor blade.
