The real world is infinitely more standardized than virtual worlds are.
Physical space itself enforces a set of laws that make any two objects "compatible", regardless of any established interoperability agreements.
However, in software, there is no such constraint. Two randomly chosen software components are, in general, less composable than a chair and a galaxy.
This is the core reason why we have only been able to achieve interoperability in very specific domains. It's not because we're bad at design or planning --- it's because the space of ideas itself is simply so overwhelmingly large that it takes time and incredible coordination to get anything like pre-built IKEA blocks which fit together naturally.
Yeah, but it does not have to.
There are, for example, products like Keycloak: an OpenID/OAuth2/token/security IdP in a box.
Why is there no ticket system in a box? Yes, we all know Jira and friends, but those are products, not building blocks.
Another angle: why the heck does everyone rewrite their microservice foundation? Inbox/Outbox/Worker Queue/Throttling/tracing/... What happened to the application servers of the past?
I am a big supporter of that narrative. Why should I need to write more than my dedicated business logic and wire up some UI (and don't get me started on the UI space)?
IMHO, this could be a real differentiator for language platforms. Ruby has parts of it, but is still far off.
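To make that list concrete, here is what one of those pieces, the transactional outbox, boils down to. This is a minimal TypeScript sketch with made-up `Db`/`Tx` interfaces standing in for a real SQL driver, not any particular library's API:

```ts
// Transactional outbox, sketched: the business write and its event row
// commit in one transaction; a separate worker publishes the events.
// `Db` and `Tx` are illustrative stand-ins for any SQL client.
interface Tx {
  execute(sql: string, params: unknown[]): Promise<void>;
}
interface Db {
  transaction<T>(fn: (tx: Tx) => Promise<T>): Promise<T>;
}

async function placeOrder(db: Db, orderId: string, payload: object) {
  await db.transaction(async (tx) => {
    await tx.execute("INSERT INTO orders (id, body) VALUES ($1, $2)", [
      orderId,
      JSON.stringify(payload),
    ]);
    // Same transaction, so the event can neither be lost nor
    // published for an order that never committed.
    await tx.execute("INSERT INTO outbox (topic, body) VALUES ($1, $2)", [
      "order.placed",
      JSON.stringify({ orderId }),
    ]);
  });
}
// A worker then polls `outbox`, publishes each row to the broker, and
// deletes it: exactly the plumbing everyone keeps rewriting.
```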
One reason might be that microservice foundations tend to become frameworks rather than building blocks.
Auth is a building block you can slot into your system and the shape of the hole isn't all that complex. A framework is something you need to slot your system into, and it has a lot of twists and turns so it ends up restricting how you do all the other things.
There are lots of microservice-foundation-related building blocks. E.g. I use a library somewhere that just does circuit-breaking. I slot that into my app rather than the other way around, and if I wanted to replace it, it's a fairly isolated thing. I don't use any frameworks for it, though.
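A sketch of how small that slot can be; this is illustrative TypeScript, not any particular library's API:

```ts
// A minimal circuit breaker: after `maxFailures` consecutive failures,
// calls fail fast for `resetMs` before one trial call is let through.
class CircuitBreaker {
  private failures = 0;
  private openedAt = 0;

  constructor(private maxFailures = 5, private resetMs = 30_000) {}

  async call<T>(fn: () => Promise<T>): Promise<T> {
    if (
      this.failures >= this.maxFailures &&
      Date.now() - this.openedAt < this.resetMs
    ) {
      throw new Error("circuit open: failing fast");
    }
    try {
      const result = await fn();
      this.failures = 0; // a success closes the circuit again
      return result;
    } catch (err) {
      this.failures++;
      this.openedAt = Date.now();
      throw err;
    }
  }
}

// "Slotting it in" is one wrapper at the call site; removing it is just
// deleting the wrapper, which is what makes it a building block.
// (The URL is a made-up example.)
const breaker = new CircuitBreaker();
const fetchUser = (id: string) =>
  breaker.call(() => fetch(`https://api.example.com/users/${id}`));
```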
Frameworks also tend to grow more and more complex as they evolve. If I as a user want a different auth for example, that needs to be supported by the framework. Over time, all things become pluggable and configurable, until at some point it's complex enough that someone restarts the process, or makes an opinionated spin of the framework.
> Inbox/Outbox/Worker Queue/Throttling/tracing/
There exist stable solutions for all of this, at least in the Java world. Spring Boot and the exhaustive Spring Boot auto-configurations are all about this…
> What happened to the application servers of the past?
It's now called Kubernetes. With the caveat that it has no first-class integration for any language, but second-class integration for all languages.
I would disagree. Software has certain constraints, or, for want of a better concept, a certain nature. E.g. I myself would say that one side of it is that you always have a very small tool that is supposed to operate on much larger data, and it can only manage this by doing many, many steps that have to be perfectly orchestrated. This is true of the low-level CPU, but as we move higher we start inventing things, and we can invent something rather different: on certain levels it may convincingly look like we are sending a bunch of elves against a bunch of orcs. There is a multitude of such concepts; they are vastly different and, of course, incompatible, and it may look like they're all worth each other and what we choose is a matter of preference. Yet among them there is a single concept that is special, because it is true to the nature of software, or at least closer to that truth than the others. It is not to be invented, but to be discovered. If we discover it, we will solve the problem of compatibility.
This doesn't match reality as soon as you start working with things that actually deal with laws and taxation. If physical space were standardised, we wouldn't have 366226723 implementations of asset management systems. Also, compatibility depends heavily on the laws: is this chair and that chair the same? Not for taxation purposes, because one is rented and the other is still depreciating, so you absolutely cannot say you have 2 of those chairs in the office.
I think in a way the OS is the "physical space" of software. This matches common composable interfaces like lines of text, files, or HTTP calls (REST).
> It's not because we're bad at design or planning
No, it is: code writers are bad at design, testing, and debugging. That's why LLMs are so popular: just throw together some code and see if it compiles.
How many mutually incompatible GTK or Qt versions are there? Why does someone have to come along every few years with a new graphics library? Has the JPEG, TIFF or PNG standard changed? And who thinks that adding layers of abstraction solves problems?
> We treat every new project like a piece of bespoke furniture, meticulously hand-carving the same legs, joints, and frames that have been built a million times before.
No, we don't. The scale of his own code contributions relative to "Postgres, Timescale and S3), a view layer (a set of Vue web apps) and a business logic layer, consisting of a set of NestJS") is negligible. The price he paid was the time it took to climb a learning curve.
It’s interesting because there is a view that we already have this. You can deploy most of the infrastructure you need to do the kind of thing the author is talking about using a few hundred lines of docker-compose.yml. The author alludes to some of that - grafana, postgres, etc.
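To make that concrete, a toy compose file of the kind being described; images, versions, and settings here are illustrative placeholders, not a recommendation:

```yaml
# Three off-the-shelf "IKEA pieces" plus a slot for the bespoke part.
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example
    volumes:
      - pgdata:/var/lib/postgresql/data
  grafana:
    image: grafana/grafana
    ports:
      - "3000:3000"
  keycloak:
    image: quay.io/keycloak/keycloak
    command: start-dev
    environment:
      KEYCLOAK_ADMIN: admin
      KEYCLOAK_ADMIN_PASSWORD: admin
  app: # the glue / business logic: the one part you can't download
    build: .
    depends_on:
      - postgres
volumes:
  pgdata:
```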
What is missing is the glue, or the business logic, or maybe just the custom frontend you can charge a lot of money for ;)
Using any blogging software plus Stripe or Square or PayPal, you can set up an online store that comes close to the experience Shopify offers - to the customer. But the backend is nowhere near as good as Shopify's: you are copy-pasting addresses over to USPS or UPS, tracking orders with notes attached to Stripe payments, and god forbid you should want to automate any transactional emails related to it. Oh, and your system falls apart when you start taking orders via eBay or Etsy, since you linked so much process to Stripe. So the whole thing devolves into a carefully crafted Google Sheet.
Which is all to say, I agree with the author - but I feel like we are winning the war; it's just that the battlefront moved from deploying backend services to gluing the pieces together.
This reads like a junior developer rant.
His dependency count is in the hundreds. 90% of his code is glue code between those dependencies. Yet he thinks he is the one who created something.
I work full time as a software engineer, and I also spend time writing code on side projects. I love working on my side projects because it means I get to spend a bunch of time learning the quirks of whatever tooling and frameworks I decide to use. I often develop opinions and personal standards that I bring with me to future side projects (and sometimes $dayjob), although I've never quite stood up all the components of one of my projects "quickly". I'll always find something to tinker with and learn about, which I think is acceptable for my personal projects, desirable even.
As I get more personal projects under my belt, I believe I'll be able to stand up projects more and more quickly, although it's never perfect. Even though I've been using a similar stack among my side projects for a couple of years now, dependencies get outdated. Sometimes you gotta jump to a new major version. Sometimes you wanna try out the "new way".
I like the idea of building up my own personal stack of tooling, frameworks, and patterns that I use, and whose use I could even encourage at $dayjob, but for the reasons outlined above, I agree with the conclusion of the article, which is that an "IKEA of software" doesn't exist currently.
For now I'll keep happily tinkering in my side projects. This article was a good read.
The reason is that the underlying technology evolves, and software itself goes through cycles of innovation. Software stays just flexible enough to adapt to the changes and absorb the innovations. It finds this balance "by itself" because software exists in a competitive market, and one where an equivalent of "economies of scale" dominates. Really, it's "economies of attention." All the tools that most people use today are in use because they were able to (1) adapt to many use-cases while also (2) adapting to changes in their environment and (3) absorbing new ideas.
The market learned a long time ago that "high level" text-based languages with libraries are the sweet spot, and it learned many other things about how such tools should look.
People are constantly trying to crystallize higher-level ideas: build some version of IKEA or Lego, or whatever. It may be that one day this will work. Certainly we see consolidation around strategies as time goes on. But so far, higher-level systems cannot adapt to enough use cases while also adapting to changes in technology and absorbing innovations.
The type of IKEA the author speaks about is Web2.0 software. I was around when we invented all that, and it wasn't that long ago.
Now we're already living through another radical change that software will adapt to—agentic software. If we all used IKEA software, much of it would be thrown out the window in the transition, and we'd lose the value accrued in those systems.
Instead, systems that have been adapting for decades are adapting like they always have. Python, the language of engineering, just adds libraries. C/C++/Rust, the anti-IKEA, find new uses. GPU stuff is repurposed for ends never imagined. Nobody uses jQuery anymore.
It's a brutal state of affairs, really. Werner Herzog says it best: https://www.youtube.com/watch?v=pF5xBtaL3YI
I don't think the IKEA analogy works. IKEA is like a low-code solution, which the article also mentions as being too limiting. If you can't design the closet you want in the IKEA Pax or Platsa system, you're also out of luck and will need to get something completely custom built. Especially since the paper-honeycomb IKEA furniture doesn't lend itself to modifications.
Analogy breaks instantly.
Let's not ignore SAP, Salesforce, and dozens of other "you can run your company on that" software suites that have exactly what the author proposes.
Oh yeah, but you want that, only you don't want to pay…
Also, those things are horrible. You pay in money, and in sanity, and in productivity.
Probably those SAP, Salesforce, ServiceNow folks come somewhat closer?
Like the author says, a fleet-tracking system, a bus-ticketing platform, or an IoT platform share basically similar requirements. And this is what those SAP types offer: standardised, templated versions of workflows.
But slowly, as they get more and more standardised, they start feeling like calcified systems that the end users hate, because they are now forced to work according to the templates.
And then comes the need for customization, and the move beyond IKEA-like standardization.
But I see the allure of the idea.
The systems you mentioned are one size fits all.
IKEA things are designed to be simple and cost effective. And it's a vast collection of different things.
At least this is the difference I see.
You make it sound more like Unix CLI tools. I guess that makes some sense.
The comparison to artisanal furniture really lands because we do spend way too much time on the basics. It would be nice to just assemble the pieces we need and move on to the interesting work.
I'm not too sure about this take. People constantly try to solve the larger code-rewrite problem, and somehow that keeps making it worse.
In another view, standard libraries do a pretty good job.
Just to be sure... What is the interesting work you talk about?
I've seen this reusable assembly of software components again and again, in discussions over the decades as well as in some older research papers. On one side it's good that we periodically check whether it can be done with whatever new tech is available.
But have any efforts gotten far enough to be universally plug-and-play? Open-source projects have come close. Now the open-source projects used by AI coding assistants come close too, but still not quite plug-and-play. I think I know why it hasn't happened, and how it maybe can.
I think the idea of abstraction in software engineering, which has guided OOP and even how we write classes today, took hold because it was pioneered early and may be a natural way for us to think about systems. But abstractions change from person to person and from team to team. Two fleet-system components built for the same purpose by two different teams will behave differently internally. So universal plug-and-play for the same flows might not work the way we think it should.
Even low-code/no-code components, which are highly reusable across completely different projects by different people, may introduce overhead that doesn't work for a given system, because that system's abstraction isn't the same one the component was designed around. Teams end up reshaping their abstractions of how the system should be around what is already there.
What's happening now with AI coding assistants may be component reusability as it should be in this engineering world: it reuses approaches from any open-source project, we can even alternate between approaches and get the best out of the different choices (if you know what you're doing), all via our own abstractions of how the software should be.
I think we are there: an antifragile way of software reusability. We should embrace the chaos. A universal, metaphorical plug-and-play for software engineering.
There has to be a central component that allows everything else to be glued together, discoverable parts and configurable bits. It’s really hard to do that generically enough that it works for everything.
I think the closest I’ve seen was OSGi. You had ways to do anything based on modules that were completely decoupled, or you could introduce coupling in a very principled manner as needed. This worked for backend and front end components and the Eclipse IDE was built that way.
JavaBeans, from the same era, were envisioned to allow the creation of LEGO blocks with which you could build an application. They expected there would be marketplaces where you could buy and sell Beans that did all sorts of things, from pure data representation to Swing UI components like charts and dashboards. I think OSGi built on that idea.
The problem was people thought OSGi was too complex, hard to learn, hard to debug and so on. I believe that was a problem with the execution, not the idea.
Now that I consider it, perhaps Emacs is another example of this concept. You have access to use and modify nearly everything with small code snippets. It even has a UI system, though that's a bit outdated and really just text-based! People are trying to create similar systems using more modern technology, but it's very hard to do well.
Perhaps yet another example is Smalltalk. Have a look at the Glamorous Toolkit (GT) and its support for UIs that live inside the programming environment itself.
I think this hasn't yet been achieved because components need to interface with each other easily. This requires a standard that all components implement, from which everything can be assembled together.
From that perspective, the idea of microservices is basically "IKEA for software" relying on (primarily) HTTP as the interface between components. But this doesn't really solve it entirely, or very elegantly, because you still need to write the server boilerplate and deploy it, which will be different depending on the programming language being used. Also, your app may require different protocols, so you'll be relying on different standards for different component interactions, therefore the interface is not constant across your entire application.
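For illustration, here is the sort of per-service boilerplate meant here, sketched in TypeScript with Node's built-in http module (the /health route is a made-up example):

```ts
// The scaffolding every HTTP-shaped "component" repeats: bind a port,
// inspect the request, route it, serialize the response.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.method === "GET" && req.url === "/health") {
    res.writeHead(200, { "content-type": "application/json" });
    res.end(JSON.stringify({ status: "ok" }));
    return;
  }
  res.writeHead(404);
  res.end();
});

server.listen(Number(process.env.PORT ?? 3000));
```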
I believe there's one way we can achieve this reliably, which is via WebAssembly, specifically via the WASM component model [1].
But we need an ecosystem of components, and building an ecosystem that everyone uses and contributes to will likely be the challenging part. I'm actually working on this right now: the platform I've been building (asterai.io) started out as an agent-building platform (using WASM components for tool calls) but is evolving into being mostly a registry and (open-source) lightweight runtime for WASM components.
The idea of using WASM to solve this is very simple in concept. Think about a tool like Docker, but instead of images you have an "environment": a file that defines a set of WASM components and ENV vars. That's basically it; you can then run that environment, which will run all components that are executable. Components can call each other dynamically, so a component can act as a library as well, or it may be only a library and not an executable. A component can also define only an interface (which other components can implement), rather than contain any implementation code.
This architecture solves the main challenges that stop "IKEA for software" from being a reality:
1. You can write WASM components in any programming language.
2. You can add components to your environment/app with a single click, and interfacing is standardised via WIT [2].
3. Deploying it is the same process for any component or app.
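For a flavor of the WIT mentioned in point 2, a hypothetical interface definition (the package, interface, and function names are made up for illustration):

```wit
// A component advertises what it provides in WIT; any language that
// compiles to a WASM component can implement or import this interface.
package example:greeter@0.1.0;

interface greet {
    greet: func(name: string) -> string;
}

world greeter {
    export greet;
}
```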
Of course, it still would require significant WASM adoption to become a reality. But I think WASM is the best bet for this.
[1]: https://component-model.bytecodealliance.org
[2]: https://component-model.bytecodealliance.org/design/wit.html
I feel like docker has (somewhat incidentally) basically provided this? It's a few lines of yaml to spin up postgres, redis, nginx, and then you just slot your bespoke javascript in between...
(the box where "bespoke javascript" goes could obviously use some work in this picture)
>Software would be decoupled from its creator, very much in the same way writing decouples information from the writer or video decouples content creators from their content.
"top-to-bottom" needs a better name that doesn't omit the "reverence for the artisans" (aka supply chain nous) that's in IKEA's DNA
https://archive.ph/2020.05.30-154951/https://hbr.org/2013/09...
(We can argue that Toyota or IKEA are both losing out to China in other threads)
(Much aware of the SW-arch context btw)
This could motivate an algebraic-theological def for Love:
"The Creator cannot be decoupled from His [Software (sorry*)] Creations"
*As in Lean4 syntax, perhaps
https://proofassistants.stackexchange.com/questions/4541/lea...
Twilio's Flex isn't far off this concept. For building simple voice and text message applications, it does quite a lot out of the box. It's their full telephony stack but presented at a much higher level of abstraction.
What goes around comes around.
Back in the '80s, with OOP starting to take off, it was ICs for software; then components (famously successful during the '90s in the Windows world); now it is IKEA.
This is basically what most cloud platforms provide. Cloudflare has some of the most high level components/building blocks imo.
Reminds me of some frameworks that did the top to bottom approach.
Yii (PHP) had a code generator that created all the CRUD logic for you. After generation, you could remove the code that you didn't need.
HyperCard is the IKEA for software.
Do you have a more current example?
Sticking with Apple, FileMaker
I was a FileMaker developer and I wouldn't say that. The way to develop on FileMaker is to be an expert in workarounds that would stun an outsider.
It was a good data processing tool in earlier versions. I always liked the splash screen picture of v4: a file cabinet the size of an office building. It was a very good metaphor. But even then there were always rather hard and strange limitations on what you could do, and the main skill you developed was inventing disgustingly clever ways to overcome them.
(Example: sometimes people want to sort entries in a pop-up menu in a custom way, but there is no built-in capability to do that: the entries are always sorted alphabetically. So one of the "solutions" is this: you give each entry a prefix that consists of zero-width spaces. The first entry gets one space, the second two spaces, and so on. As a result, when they are sorted, they are sorted by that prefix, but since the prefix is invisible, the user only sees the entries themselves. If you are careful and lucky and the users aren't too inventive, these spaces won't contaminate the data, although generally contaminated data is the normal state of most data in FileMaker.)
> If you are careful and lucky and the users aren't too inventive these spaces won't contaminate the data, although generally contaminated data is the normal state of most data in FileMaker.
Thanks for the example. I have a copy of v19 I've been meaning to work with. Have such workarounds become less necessary?
There is still a lot. E.g. that thing about sorting a pop-up has been there since v4 at least and, of course, there were requests to improve it, but in v19 it is still the same. FileMaker was always mostly concerned with adding more features they can highlight, and bugfixes are not that exciting. They do fix some, but there is a lot of old baggage.
It is not outright bad, of course. It is very easy to create a relational-like database with several tables and give them decent layouts that do a good job of displaying the data and jumping between tables. These layouts are directly searchable; you can sort, select and, as they say, "massage" the data. All this doesn't even require much scripting; you'll get rather far with simple single-action buttons. If you use it as a personal tool you can get very good mileage out of it.
But as you go deeper you'll see a rather strange mix of things that do not really work well together. There are calculated fields; not a bad concept in general, but in FileMaker this is the only way to search or sort by a calculated criterion: you have to turn it into a field. As a result, a nice-looking table starts to get populated by calculated fields for various purposes. Those that span related tables are marked as "unstored". These are going to be slower, especially over a network. In fact they can be so slow that some developers go out of their way to imitate such calculations: they just add a normal field and try to make sure it always stays updated as the data changes. Which is not that simple if there is a network and multiple users.
Then there are scripts. Again, a good thing, but their original idea was that they would simply resemble user actions: go to that layout, enter that field, etc. This means there is some context, similar to what the user sees, and what is not visible from that context won't be visible to the script. E.g. you cannot just read a field in some table even if you know the ID of the record: you have to go there and find that record (enter find mode, set fields, perform find), or go to a place that can see that record over a relationship.
There is SQL, right. But that SQL is given to you only in the form of the 'ExecuteSQL()' function and can only read data. (There is also a script step, but that one is about connecting to external data sources.) If you want to change the data, you must use a plug-in: plug-ins have access to SQL without this limitation. (But they do not have access to anything else in FileMaker; they are basically just additional functions or script steps you can use in your expressions or scripts: they receive data, send back a result, can evaluate an expression or do SQL, and that's it.) You can use placeholders for data ('WHERE "abc" = ?'), but there are no placeholders for table and field names. FileMaker is rather smart when it comes to names: it tracks them by internal IDs, very much like the original Mac tracked files, so when you rename a field, nothing breaks. But not in SQL expressions; here they are entered as text, and if you rename a field, the SQL will fail. So if you want robust SQL, you need to come up with a solution that supplies the field names dynamically. It is doable, but, of course, it complicates things and it will be your own house rule.
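A hedged sketch of that house rule in FileMaker's calculation syntax; the Orders table and $customerID variable are made up for illustration, while ExecuteSQL() and GetFieldName() are the real functions:

```
/* ExecuteSQL ( query ; fieldSeparator ; rowSeparator ; arguments... )
   The ? placeholder covers data, but field names are plain text and
   break on rename, so derive them via GetFieldName() instead: */
Let ( [
  qualified = GetFieldName ( Orders::CustomerID ) ;    // "Orders::CustomerID"
  column = Substitute ( qualified ; "Orders::" ; "" )  // strip the table prefix
] ;
  ExecuteSQL (
    "SELECT COUNT(*) FROM Orders WHERE " & column & " = ?" ;
    "" ; "" ; $customerID
  )
)
```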
Then there are "custom functions". You see, you can create your own scripts, you can write expressions that are part of script steps or fields, but you cannot call a script as a part of such an expression. So there are "custom functions" that fill this niche. Basically they are expressions with names and parameters that you can create. Again, they could be useful, but they are linked to a single file and in a multi-file app you have to copy them between the files yourself. There is no automation. Of course, you'll soon end up with several subtly different versions in different files.
And this continues. There is a JSON parser, but JSON is passed around as text and the parser works by evaluating a single path-like expression, which means it parses that JSON each time. It is not fast. The layout features are relatively limited, but there is a web view; some people write whole widgets in that web view (e.g. a calendar). There is some integration: you can call a script in the web viewer and a web viewer can call a FileMaker script. But now your database contains HTML and JavaScript for what you're going to display.

There is no drag-and-drop; but there are "container" (binary) fields, you can drag content between them, and there is an auto-enter feature, and (I already forgot how it was done) some people built limited drag-and-drop this way. There are variables, of two kinds: some are script-wide, some are application-wide (global), and they can work like arrays.

Related data over the network gets very slow, so people invented such a thing as "virtual tables": you create a single table that has no data, only unstored calculations that refer to cells in those global variables, and when you display such a table nothing gets transferred over the network; everything is read from the variables in memory, so it works pretty fast. But now you have to devise a strategy to fill those variables yourself, and the data you see is not real anymore (it cannot be changed directly), so you need to add another way to do that.

Their network layer, by the way, was created long ago and, judging from the files in the app directory, it uses CORBA; so, apparently, they just work with remote tables as with local tables, without trying to account for latency and such. So complex interactions (like sorting) over a network tend to get very slow. To sort of solve that, there is a way to run a script on the server. But it is not integrated too well, and I remember building a system for myself to be able to specify this as a general preference rather than hardcode it at the script level, as is normally done.
Their expression engine has a somewhat strange but convenient way to group certain things into an array-like structure; e.g. if you need to substitute multiple strings, you can do this with a single call to 'Substitute()' and pass it an array of substitutions. But this is not available to plug-ins; there you can only receive a one-dimensional array of parameters. In a plug-in, though, you can specify that it processes any number of parameters; with custom functions you cannot: the number of parameters is fixed.

Scripts too can get parameters, but here the situation is even worse: they get a single text string, and if you want to pass multiple bits of data, you have to devise your own mechanism for that. Nowadays most pass JSON; but the JSON parser appeared only recently, does not understand FileMaker data types too well, and JSON is rather tedious to build in FileMaker, as you have to specify the JSON type for each property or write literal JSON as a string and escape every quote. The limitation is somewhat lifted if you call a script via the Web API; there you can, e.g., set script variables for the target script. But that Web API is only available on the server, I think; there is some Data API you can call from inside, but I'm not sure what it does, as by that time I had mostly switched to other things. And the Web API is limited: you subscribe to a certain number of calls. There is an older, unlimited XML-based API, but it cannot do this for scripts.
And so on. They have lots of utility: servers for three platforms, desktop and mobile clients, access to specific Apple APIs, a Web API, etc. But inside, all of this is like Frankenstein's monster, sewn together from totally different parts.
I think you’re looking for Laravel, with its many, many official packages and the thousands of community packages, which often are full features, including an optional frontend. :-)
People have been building low-code or modular software components for decades, at all layers of the software stack: compilers, languages, frameworks, libraries, components, platforms, extensions, you name it.
Just some examples over the years:
Odoo, Flutter, Eclipse, Enterprise JavaBeans, WPF, n8n, package-manager-based approaches, Microsoft Power Apps, app builders, Supabase...
Which one is IKEA?
Software is an abstract quantity, dependent on nothing but human whim, tech progress, popularity, and cost to build, and it changes year over year. You cannot build an IKEA-like collection or library of components when everything shifts annually like a house of sand and no one agrees on anything.
Now if you could pick and choose a few things and have a CRUD app, maybe. But most people have requirements that need specific workflows, and that breaks the analogy.
To me, AI iteration is the biggest revolution of the past few years. AI might allow fuzzy modifications to largely pre-built stacks, making something like an IKEA-style approach work.
Yes, AI solves this issue in the best way. It's really good at adding features that have already been built many times, but in a way that's customized to your needs and code style.
It exists; it’s called SAP, Workday, Salesforce, etc.
That isn't IKEA for software. That is as if IKEA only made chairs, and primarily focused on one type of chair, like wingbacks.