Latest Posts (15 found)
Can ELMA 3 weeks ago

AI Was Supposed to Help Juniors Shine. Why Does It Mostly Make Seniors Stronger?

The question “Will coding be taken over entirely by AI?” has been asked to death already, and people keep trying to answer it. I’m not sure there’s anything truly new to say, but I want to share my own observations.

The early narrative was that companies would need fewer seniors, and juniors together with AI could produce quality code. At least that’s what I kept seeing. But now, partly because AI hasn’t quite lived up to the hype, it looks like what companies actually need is not junior + AI, but senior + AI.

Let’s look at where AI is good and where it falls short in coding. Where it helps: And who benefits most from that? Obviously seniors. In the hands of a junior, these things are harder to turn into real value. Still possible, but much tougher.

Where it backfires: There are more examples, but the main point is this: AI is not really a threat to senior developers yet. It may even be the opposite. And this is not about criticizing juniors. It is about not throwing them into risky situations with unrealistic expectations.

Where we should use AI: From my perspective, that is the current state of things. We still have to read every line AI writes. It is far from perfect. No awareness. Reasoning is imitation. It is non-deterministic, which is why we rely on deterministic things like tests. But then, are you really going to trust the AI to write the tests that verify its own code?

It reminds me of something I tweeted: there was a prompt making AI say “I don’t know” when it didn’t know. My take was: “If such AI says ‘I don’t know,’ you can’t be sure it knows that either.”

Of course, the junior + AI pairing was tempting. It looked cheaper, and it fed the fear that “AI will take our jobs.” But when you compare software to other professions, the field still shows signs of immaturity. In construction, architects design. In software, even the architects are still laying bricks by writing code.
Our roles are still not specialized or merit-driven enough, and cost-cutting dominates. That devalues the work and burns people out. So instead of democratizing coding, AI right now has mostly concentrated power in the hands of experts. Expectations did not quite match reality. We will see what happens next. I am optimistic about AI’s future, but in the short run we should probably reset our expectations before they warp any further.

1 view
Can ELMA 2 months ago

Postmortem: How I Crashed an API with a Cloudflare Compression Rule

Sometimes the most valuable lessons come from our biggest mistakes. This is the story of how a single misconfigured Cloudflare compression rule broke our Server-Sent Events (SSE) streaming and brought down an entire API for several hours.

- Date: August 15, 2025
- Duration: 4 hours 23 minutes
- Impact: ~20% API downtime, 15,000+ affected users
- Root Cause: Cloudflare compression rule breaking SSE streaming

I was working on performance optimization for our API endpoints. The goal was to reduce bandwidth usage and improve response times by enabling Cloudflare's compression features. I enabled the Cloudflare compression rule:

The issue wasn't immediately apparent. The compression rule looked safe, but I had forgotten a critical detail: our API used Server-Sent Events (SSE) for real-time streaming, and Cloudflare's compression breaks SSE. The compression rule was enabled without understanding that it buffers data, breaking real-time streaming.

This incident taught us that compression isn't always beneficial — it can break real-time protocols like SSE. The key lesson is to understand how infrastructure changes affect your specific use cases, especially streaming protocols.
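The rule itself is elided above. On the response side, one common defence (my assumption, not part of the actual incident fix) is to mark SSE responses as non-transformable so intermediaries like Cloudflare don't compress or buffer them:

```python
# Hypothetical sketch: headers for an SSE response that ask intermediaries
# not to buffer or re-compress the stream. The function name and the exact
# header set are assumptions based on general HTTP semantics, not taken
# from the incident described in the post.
def sse_headers():
    return {
        "Content-Type": "text/event-stream",
        # no-transform asks proxies not to apply transformations
        # such as compression to the response body.
        "Cache-Control": "no-cache, no-transform",
        "Connection": "keep-alive",
        # Some reverse proxies (e.g. nginx) honor this hint to
        # disable response buffering for this request.
        "X-Accel-Buffering": "no",
    }
```

Whatever framework produces the stream, attaching headers like these to the SSE endpoint gives compressing proxies an explicit signal to leave the response alone.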

0 views
Can ELMA 8 months ago

ClothShift: Transforming Fashion Experience with AI

The fashion industry is undergoing a digital revolution, and at the heart of this transformation is artificial intelligence. ClothShift represents the next generation of fashion discovery, where AI doesn't just recommend clothes—it creates realistic virtual try-ons, enables powerful image editing, and generates lifelike models for creative and business use. Traditional fashion shopping has several pain points: ClothShift's AI creates stunning, lifelike try-ons that users can trust: The platform goes beyond simple try-ons with powerful creative tools: ClothShift delivers a seamless experience across all devices: Building a comprehensive AI-powered fashion platform requires sophisticated technology: The platform has achieved impressive results based on actual user data:

- Challenge: Creating realistic clothing visualization on diverse body types
  Solution: Advanced machine learning models trained on extensive fashion datasets
- Challenge: Maintaining feature parity between mobile and web
  Solution: Shared backend API with platform-specific optimizations
- Challenge: Making complex AI features accessible to non-technical users
  Solution: Intuitive interface design with guided workflows

ClothShift is just the beginning. The future holds: Virtual try-on was the core feature that users immediately understood and valued. The technology serves the user experience, making complex tasks simple and enjoyable. Different users prefer different platforms, so multi-platform support is essential. Every user interaction provides valuable data for improving AI models and features. ClothShift is expanding to include:

0 views
Can ELMA 1 year ago

Telegramic: The Road to 400k Users and a Successful Exit

Building a successful product is one thing, but scaling it to hundreds of thousands of users and achieving a successful exit is a completely different journey. This is the story of Telegramic, from its humble beginnings to becoming the go-to platform for Telegram discovery. It all started in 2018 when I noticed a gap in the Telegram ecosystem. While Telegram was growing rapidly, there was no centralized place for users to discover quality bots, channels, and groups. The idea was simple: create a social directory that would help users find what they were looking for. The initial launch was modest, but the response was immediate. Telegram users were hungry for discovery tools, and developers needed a way to showcase their creations. Within months, we had: We didn't just build a directory; we built a community. Users could submit content, rate quality, and engage with developers directly. The platform was built with scalability in mind from day one. Django + PostgreSQL + Celery provided the foundation for handling millions of requests. We actively partnered with major bot developers and Telegram channels, creating a network effect that accelerated growth. Simple, intuitive design that made discovery effortless. Users could find what they needed in seconds, not minutes. By 2023, Telegramic had become the de facto standard for Telegram discovery. We had: When acquisition offers started coming in, we knew it was time to consider the next chapter. The exit wasn't just about the money; it was about ensuring the platform could continue growing under new leadership with more resources. Telegramic succeeded because it solved a genuine pain point in the ecosystem. Users needed discovery tools, and we provided them. Building a product is one thing; building a community around it is what drives sustainable growth. Investing in proper architecture from the beginning paid dividends when scaling to hundreds of thousands of users. 
We launched at the perfect time - when Telegram was growing but lacked discovery tools. While I'm no longer directly involved with Telegramic, the platform continues to thrive under new ownership. The new team has expanded features, improved the user experience, and continued growing the community. The Telegramic experience taught me invaluable lessons about building and scaling products. I'm now focused on new ventures, particularly in the AI and mobile space, applying the same principles that made Telegramic successful.

0 views
Can ELMA 3 years ago

Ignore files without modifying .gitignore

The .gitignore file is also tracked, so you have to expose a list of your ignored files when you add them to it. This may not be what you want. In the .git folder, there's a file at info/exclude. This file works just like the .gitignore file but without being tracked. You can ignore untracked files with this file without exposing them. The exclude file does the job if you only want to ignore untracked files. But what if those files are already tracked? In such cases, you can use the commands below to ignore a specific file or path. To create aliases for these commands:
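The post's actual commands and aliases are elided above. A common way to do this (my assumption, not necessarily the author's exact commands) is git's skip-worktree flag. A self-contained demo in a throwaway repository:

```shell
# Demo in a temporary repo (file and alias names are made up for illustration).
cd "$(mktemp -d)"
git init -q demo && cd demo
git config user.email demo@example.com
git config user.name demo
echo "secret=1" > local.conf
git add local.conf && git commit -qm "track a config file"

# Tell git to ignore further changes to this already-tracked file:
git update-index --skip-worktree local.conf
echo "secret=2" > local.conf
git status --porcelain   # prints nothing: the change is invisible to git

# Repo-local aliases so the commands are easier to remember:
git config alias.hide 'update-index --skip-worktree'
git config alias.unhide 'update-index --no-skip-worktree'
```

After setting the aliases you can run `git hide <file>` and `git unhide <file>`; `--no-skip-worktree` reverses the flag so the file's changes show up again.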

0 views
Can ELMA 4 years ago

Managing dotfiles with Stow

Dotfiles are plain-text configuration files with a dot at the beginning of their filename, which makes them hidden. These files may configure your git tool, or functions you use in the shell, or something else. If you're on a GNU/Linux distribution or similar, you already have many dotfiles. For example, check the file, or if any. As you make your own changes to these files, you will want to be able to keep and manage these changes. The Stow tool helps me a lot in this regard, and with it I can manage my dotfiles with a version control system such as git. It's simple. Let's go through an example.

We want to manage our SSH related configuration files. We can find these files in the folder. First, create a folder named , and another folder named in it. Now copy your folder with its content to this new folder, excluding your private files you don't want to expose in your VCS. You'll have a folder hierarchy like this: Now our ssh module can be managed with Stow. Move this folder to where you want to keep all your dotfiles. Remember that if you want to use Stow without specifying a target directory, you need to keep the directory in your home directory. I keep it as a hidden folder in my home directory for easy use: . After taking a backup of your folder, delete it for testing. Run the following command in the directory: With this command, Stow creates symbolic links in your home directory, preserving the folder structure. What just happened is that you now have a folder again, but this time with symlinks to the directory. So the actual files are in :

When you switch to a new computer, all you have to do is transfer your folder to this computer and stow the modules you want with the command. Whatever you put inside the folders (modules) you created in the dotfiles folder, the Stow tool will create symbolic links to them in the parent directory by default, preserving the folder structure. That's all. You can change the default target directory with the option.
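The folder names in the walkthrough are elided. As a sketch of the workflow (directory names like ~/.dotfiles are my assumptions, not necessarily the author's), the commands look roughly like this:

```shell
# Assumed layout: one "module" folder per tool inside ~/.dotfiles,
# mirroring where each file should live relative to $HOME.
# ~/.dotfiles/ssh/.ssh/config  ->  stow will symlink it as ~/.ssh/config
mkdir -p ~/.dotfiles/ssh/.ssh
cp ~/.ssh/config ~/.dotfiles/ssh/.ssh/config   # copy, excluding private keys

cd ~/.dotfiles
stow ssh          # creates symlinks in the parent directory ($HOME)

# On a new machine: transfer ~/.dotfiles, then stow each module you need.
# A different target directory can be given explicitly:
# stow --target="$HOME" ssh
```

By default Stow's target is the parent of the directory you run it in, which is why keeping ~/.dotfiles directly under your home directory makes the command so short.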

0 views
Can ELMA 4 years ago

Using ngrok in Docker

You can run multiple services as a single app with . If you are using ngrok in your development environment, you may also need to make it a Docker service. Should you? Probably not. But if you want to, this article shows how to do it.

First of all, create a folder named . You should then create a symbolic link in that folder that references the actual ngrok executable. For instance, in the ngrok folder you now have a symbolic link to the ngrok executable. Next, you should create a configuration file for ngrok. In the same ngrok folder, create a file like the one below: Then define your ngrok service in the file: Did you notice the section in the above file? You can find more about it in this article. You can access it on .

You may also need to access the ngrok service from other services. Moreover, you may want to get the current ngrok address dynamically. Below is a sample Python snippet I wrote for this: This will return the public URL of the running ngrok instance according to the HTTP protocol you specified.
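The author's snippet is elided above. A sketch of the same idea (function names and the service hostname are my assumptions): the ngrok agent exposes a local API, typically at port 4040, whose /api/tunnels endpoint lists active tunnels.

```python
# Hypothetical sketch: query the ngrok agent's local API for the public URL
# of the tunnel matching a given protocol. When ngrok runs as a Compose
# service, the API is usually reachable at http://<service-name>:4040.
import json
from typing import Optional
from urllib.request import urlopen


def pick_public_url(tunnels: dict, proto: str = "https") -> Optional[str]:
    """Return the public URL of the first tunnel matching the protocol."""
    for tunnel in tunnels.get("tunnels", []):
        if tunnel.get("proto") == proto:
            return tunnel.get("public_url")
    return None


def get_ngrok_url(api_base: str = "http://ngrok:4040",
                  proto: str = "https") -> Optional[str]:
    """Fetch the tunnel list from the agent API and pick a public URL."""
    with urlopen(f"{api_base}/api/tunnels") as resp:
        return pick_public_url(json.load(resp), proto)
```

From another service in the same Compose network you would call `get_ngrok_url()` with the ngrok service's name as the hostname.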

0 views
Can ELMA 4 years ago

The multilingual part of my site built with Nuxt

In this article we'll look at how the multilingual part of my website works. As seen below, I have defined only two languages. The translation files ( and ) for these languages are located in the folders. With the option set to true, the files will be loaded only when needed. This is not that important though, as they are small files. I disabled the option. Since the is set to , the language prefix will be added to the address for all languages except the default one: -> .

There are two flag icons for the supported locales. The flag of the active locale becomes visible. Here's a working example: . Clicking a flag icon switches the locale. This makes more sense, as my site is only available in two languages. When changing the language with , you can keep a record of the language being changed, or check it on the new route. So if the URL that's visited after the language change gives a 404 error, you can show an error message saying that the page does not have a translated version in that language, instead of a regular 404 error message. When the language is changed, if it is not the default one, a language prefix will be added to the URL. Otherwise the language prefix will be removed.

Below is the current folder structure I use for my site. From now on, whether you're checking for a post or fetching it, you should always include the language code in your query. There are some other details, but I won't talk about them as they are directly related to the installation and use of the and modules. That's all. This is the entire custom structure I use on my site.
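The actual configuration is elided above. A sketch of what such an i18n setup might look like (locale codes, file names, and option values are my assumptions, not copied from the site's real config):

```js
// Hypothetical nuxt.config.js fragment for the behavior described above.
export default {
  modules: ['@nuxtjs/i18n'],
  i18n: {
    locales: [
      { code: 'en', file: 'en.js' },
      { code: 'tr', file: 'tr.js' },
    ],
    langDir: 'locales/',
    defaultLocale: 'en',
    lazy: true,                        // load translation files only when needed
    strategy: 'prefix_except_default', // prefix URLs for all but the default locale
  },
}
```

With `strategy: 'prefix_except_default'`, switching away from the default locale adds the language prefix to the URL, and switching back removes it, matching the routing behavior the post describes.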

0 views
Can ELMA 4 years ago

Organizing .md files with timestamps in Nuxt

If you are using static site generators, you will be dealing with Markdown files with the .md extension. Confusion arises especially as the number of .md files increases. For example, take the following structure: Wouldn't it be better if it was as below: The order is now more apparent, and we can easily see at a glance which files belong to which year or month.

Now I will explain how we can create this structure in Nuxt. Let's start by making the following change. Now Nuxt will automatically remove the leading date when creating the slug for these files. This change necessitates another minor change. Normally, when visiting the path , we would find the corresponding file inside the folder using the part ( ) of that path: This is no longer possible, as the relation between the slug and the filename is broken due to the change we made. We can solve this. You should fetch the content as below:
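The fetch code is elided above. The core of the technique can be sketched as a small helper (the function name and the exact date-prefix format are my assumptions) that maps a date-prefixed filename back to the slug Nuxt generates:

```javascript
// Hypothetical helper: strip a leading "YYYY-MM-DD." prefix from a
// Markdown filename so it can be matched against a route's slug.
// The date format is an assumption, not taken from the post.
function stripDatePrefix (filename) {
  return filename.replace(/^\d{4}-\d{2}-\d{2}\./, '')
}

// e.g. a file named 2021-03-14.my-post.md corresponds to the slug
// "my-post" even though the filename carries the date for ordering.
```

In a content query you would then match on the generated slug rather than reconstructing the filename from the URL, since the date prefix exists only for on-disk ordering.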

0 views
Can ELMA 4 years ago

Serving TXT files with Django

There are various TXT files used for different purposes, like informing crawler robots, domain name verification or identity verification. One of them is the robots.txt file. Sooner or later you will need to serve a robots.txt file with Django dynamically, in order to tell web crawler robots which parts of your website are open to search engines. Let's see how we can do it. robots.txt: A robots.txt file tells search engine crawlers which URLs the crawler can access on your site.

This is simple and easy to understand. We define a string variable that provides the content of the TXT file, and when the target URL is accessed we serve that content in response with the content type text/plain. With this method, we'll create a file and serve it as a template. We can use function-based or class-based views. As we will serve static files via NGINX (or something similar), this method seems preferable. However, each of the methods in this post has advantages and disadvantages. For instance, while you can write tests with the other methods, you can't with this one in Django. Just add a line to your NGINX configuration:

0 views
Can ELMA 4 years ago

Project-specific commands with just

Every project has its own specific requirements. Most of the time, we can meet this need with commands. But when it comes to writing and running commands, sometimes shell scripts can be a bit low level. This is where just handles the situation better. It functions as an abstraction layer between your shell and the project and helps you run commands in a handy way. You can think of it like Makefile. But unlike Makefile, just files are more readable and easier to write. You can find OS-specific installation instructions on its GitHub page. The configuration files for the tool are named justfile by default. Below is an example justfile: In the above example, there are the and commands. You can pass a parameter to the command. Did you notice the @ character in the statement? just prints each command to standard error before running it. This is suppressed for lines starting with @. I won't give more examples here, as it has too many features. You can check them out in its GitHub repo here.
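The example justfile above is elided. A minimal sketch of what one might look like (the recipe names are my assumptions, not the post's actual example):

```just
# List available recipes by default.
default:
    just --list

# A recipe that takes a parameter:
greet name:
    echo "Hello, {{name}}!"

# The @ prefix suppresses echoing the command line itself:
quiet:
    @echo "only the output is printed"
```

Running `just greet world` would substitute the parameter into the recipe, and `just quiet` would print only the echoed output without showing the command first.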

0 views
Can ELMA 4 years ago

Daphne and/or Gunicorn: ASGI, HTTP/2, WebSockets

Gunicorn is a WSGI HTTP server for Python applications. It can serve your dynamic Python application with the help of a reverse-proxy program like Nginx. Daphne is like Gunicorn but for ASGI. Unlike Gunicorn, it also supports the HTTP/2 and WebSocket protocols.

Imagine you are going to write a Django chat app. You would have to use WebSocket for a real-time, non-polling chat app. Gunicorn cannot do this alone. You can still serve your Django application with Gunicorn, but you should at least use Daphne to handle requests sent over the WebSocket protocol, and a reverse-proxy like Nginx to send those requests to the Daphne service. For example, you can redirect all the connections on the path to your Daphne service, while redirecting the rest to Gunicorn.

Although this comparison may seem like comparing apples and oranges at first glance, it's still important. That's because Daphne can also do what Gunicorn can, like handling traditional HTTP requests. This makes it possible to use only Daphne for any type of request in your, say, chat application. However, you might wonder if it's worth choosing Daphne while everyone else seems to be using Gunicorn for HTTP requests.

Daphne:
+ Handles any type of request: HTTP, HTTP/2, WebSocket
+ You don't have to reverse-proxy WebSocket requests by path (like )
- No proven stability?
- Not enough community support?

Gunicorn:
+ Proven stability
+ Community support
- No support for HTTP/2 or WebSocket
- You'll need a reverse-proxy + another server for unsupported request types

This is another solution. We can redirect incoming requests to appropriate services via Nginx. Below is an example Nginx configuration: With this Nginx configuration, the connections to the path will be redirected to the service running on the port , while the rest will be handled by the service, which could be a Gunicorn server running on the port .
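The example configuration is elided above. A sketch of what such a path-based split might look like (the /ws path, ports, and upstream names are my assumptions):

```nginx
# Illustrative only: WebSocket traffic under /ws/ goes to Daphne,
# everything else goes to Gunicorn.
upstream daphne   { server 127.0.0.1:8001; }
upstream gunicorn { server 127.0.0.1:8000; }

server {
    listen 80;

    location /ws/ {
        proxy_pass http://daphne;
        proxy_http_version 1.1;
        # Headers required for the WebSocket upgrade handshake:
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }

    location / {
        proxy_pass http://gunicorn;
        proxy_set_header Host $host;
    }
}
```

The upgrade headers in the /ws/ block are what let Nginx pass the WebSocket handshake through to Daphne instead of treating it as a plain HTTP request.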

0 views
Can ELMA 4 years ago

Docker profiles: Scenario based services

Sometimes you need some services only in certain scenarios. Docker Compose has a handy feature to achieve this: profiles. Here's a sample file for the examples in the following sections. There are profiles assigned to two of the services in the above example. The service is assigned to the profile, while the service is assigned to the profile . Unlike the other services, the and services do not have profiles, so they always start. But the and services will only be started when you activate their profiles. The command below would start the and services by default. The following command would also start the service along with the and services. That's because we enabled the profile: The command below would enable multiple profiles, and , at once and start the and services besides the and services. You can also use the environment variable to specify which profiles to enable:
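The sample file is elided above. A sketch of a compose file using profiles (service and profile names are my assumptions, not the post's actual example):

```yaml
# Illustrative docker-compose.yml: two always-on services and two
# services gated behind profiles.
services:
  web:
    image: nginx:alpine        # no profile: always starts
  db:
    image: postgres:16         # no profile: always starts
  adminer:
    image: adminer
    profiles: ["debug"]        # starts only when the debug profile is enabled
  mailhog:
    image: mailhog/mailhog
    profiles: ["dev"]          # starts only when the dev profile is enabled
```

With this file, `docker compose up` starts only web and db; `docker compose --profile debug up` adds adminer; and `COMPOSE_PROFILES=debug,dev docker compose up` enables both profiles at once.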

0 views
Can ELMA 4 years ago

Easily SSH into a VirtualBox machine

You have a virtual GNU/Linux or Windows machine on VirtualBox and want to establish an SSH connection with it. Here's what you need to do.

Terminology:
- host: Your own machine running VirtualBox.
- guest: The virtual machine you created on VirtualBox.

Shut down the guest OS if it's already running. Open the settings screen of the guest machine on VirtualBox and navigate to the Network section. In the tab, set the to . Expand the settings below and click on the . Using the button, create a new rule with the following values. Save and close. That's all for setting up the network.

Run your guest machine and let its operating system start. Using the appropriate command according to your operating system, start the service. E.g., on a GNU/Linux distribution with systemd. Try to connect to the guest via SSH: In this way, your SSH connection on port 3987 will be forwarded to your guest machine's SSH port.
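The values of the port-forwarding rule are elided above. For reference, the same rule can be created from the command line (the VM name and rule name are my assumptions; the host port 3987 comes from the post):

```shell
# Illustrative CLI equivalent of the GUI steps: forward host port 3987
# to guest port 22 on the VM's NAT adapter.
VBoxManage modifyvm "my-vm" --natpf1 "guestssh,tcp,,3987,,22"

# Then, with the guest running and its SSH service started:
ssh -p 3987 user@127.0.0.1
```

Because the rule lives on the NAT adapter, the connection goes to your own machine's port 3987 and VirtualBox forwards it into the guest's port 22.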

0 views
Can ELMA 4 years ago

How to preserve random order of an SQLite table on ephemeral disks

In SQLite databases, generating random values with a seed is not possible, as SQLite's random function does not support a seed value. That's why it may not be possible to write an SQL query that returns the same random order every time. But there are solutions, so you can preserve the random order. This post describes one of them.

ephemeral: lasting a very short time; short-lived;

On ephemeral disks like Heroku's, the files you write to disk will not persist after your application is restarted. So having an SQLite database as a file is pointless. However, if you only want to keep unimportant runtime data in that SQLite database, and you have another database as a service to keep the actual data in, the simple and "dirty" solution in this article will help you a lot.

Assuming the table name is , we add a new column to the table in order to store the random data. We can name it : Then we generate random values for each row: All the rows now have a permanent random value. Using these values, we can preserve the random order and get the data sorted by these random values. You now have a randomly pre-ordered database that can be used even on ephemeral disks. That's all.
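The SQL statements are elided above. The whole technique can be sketched end to end (table and column names are my assumptions, not the post's): store a one-time random value per row, then order by that column in every query.

```python
# Self-contained demo of the technique using Python's sqlite3 module.
import sqlite3

conn = sqlite3.connect(":memory:")  # a file path in real use
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO posts (title) VALUES (?)",
                 [("a",), ("b",), ("c",), ("d",)])

# 1) Add a column to hold the permanent random value:
conn.execute("ALTER TABLE posts ADD COLUMN random_order INTEGER")
# 2) Generate a random value for each row, once:
conn.execute("UPDATE posts SET random_order = abs(random())")
conn.commit()

# 3) Every later query reproduces the same "random" order:
order1 = [r[0] for r in conn.execute(
    "SELECT id FROM posts ORDER BY random_order")]
order2 = [r[0] for r in conn.execute(
    "SELECT id FROM posts ORDER BY random_order")]
```

Since the random values are written once and then only read, the ordering survives restarts as long as the database itself does, which is exactly what the secondary data store provides on an ephemeral disk.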

0 views