You Want Technology with Warts

(entropicthoughts.com)

99 points | by tartoran 2 days ago

60 comments

  • lmm 2 days ago

    Warts are the product of caring about backwards compatibility with initial design mistakes. So I'm not sure they're a reliable heuristic for better choice of technology. E.g. YAML is definitely much wartier than JSON, but I think the latter is a much better choice for long-term use than the former.

    • gwd 2 days ago

      I don't think the claim is "More warts is better". I think the claim is, "A technology with at least a few warts is an indication of longevity".

      Basically, if there are no obvious warts, then either:

      1. The designers magically hit upon exactly the right needed solution the very first time, every time, or

      2. The designers regularly throw away warty interfaces in favor of cleaner ones.

      #1 is possible, but #2 is far more likely.

      • cosmic_cheese 2 days ago

        It’s probably possible to minimize warts without negatively impacting other aspects of the software. It just isn’t done very often, probably because it requires either a full feature freeze (allowing all resources to go toward disposing of the figurative bathwater while keeping the baby) or, at minimum, a pointedly slow, judicious approach to feature additions that helps designers and devs get more right the first time and ship only fully baked products to users.

        No, everything today is about endless frenzied stacking of features as high as possible, totally ignoring how askew the tower is sitting as a result. Just prop it up again for the 50th time and above all, ship faster!

      • massysett 2 days ago

        Another possibility is that the designers carefully examined predecessors to learn from their mistakes, and went through a long refinement period before stabilizing the design. This is Common Lisp.

        • jodrellblank 2 days ago

          Nothing warty about functions named "car, cdr, caar, cadr, cdar, cddr, caaar, caadr, cadar, caddr, cdaar, cdadr, cddar, cdddr, caaaar, caaadr, caadar, caaddr, cadaar, cadadr, caddar, cadddr, cdaaar, cdaadr, cdadar, cdaddr, cddaar, cddadr, cdddar, cddddr"[1]

          A domain-specific language for walking into nested lists, based on an implementation detail of how linked-list pointers and values were packed into a single 36-bit memory word on an IBM 704 computer from the 1950s.[2]

          [1] https://franz.com/support/documentation/ansicl/dictentr/carc...

          [2] https://en.wikipedia.org/wiki/CAR_and_CDR

        • neutronicus 2 days ago

          Common Lisp is basically a constellation of warts that people put up with in exchange for its macro system and REPL-driven development.

        • lieks 2 days ago

          Common Lisp does have plenty of warts.

        • bluGill 2 days ago

          You can drop the old warts, but your correction may itself be a wart. Sometimes you learn the hard way why the last version was the way it was.

        • Blackthorn 2 days ago

          Dude, Common Lisp has so many warts that it makes fairy tale witches envious.

    • andrewflnr 2 days ago

      JSON also has a few warts that aren't going away exactly due to backwards compatibility. So I think that's actually a pretty good illustration of the point.

  • behnamoh 2 days ago

    > There’s a chart in the presentation that shows how environmental churn and api deprecation leads desktop applications to have an expected lifetime of maybe a decade, and phone apps closer to a couple of years. On the other hand, simple web pages have worked unmodified for over 40 years! That’s a good reason to default to the web as a technology.

    Isn't it the opposite? Just take a look at any web app and tell me it wasn't rewritten at some point due to the rapid shift in web frameworks and hype cycles. Heck, even HN was rewritten, AFAIK.

    Meanwhile, I have Matlab, which has remained literally the same for over a decade.

    • ehnto 2 days ago

      It said simple web pages, not web apps. Web apps that use APIs are basically in the same boat as a phone app: a constantly moving API target and ever-changing platform requirements.

      • snowwrestler 2 days ago

        Ok but what is serving the simple web page? Do we think that software has been patched or rewritten over the last few decades?

        I feel like there is a category error here where people think a website is the HTML that lands in your browser. When actually it is also the technology that received your request and sent the HTML. Even if it’s the same “flat” HTML sitting on the server, a web application is still sending it. And yes, those require maintenance.

        • ehnto 2 days ago

          Of course, but some technologies are more mature and stable, and maintaining those amounts to set and forget security updates, not rewriting functionality to keep up with changing features etc.

          It's not a dichotomy though. If you build with cutting edge platforms, you'll suffer churn as things change underneath you. It's a choice you should make with awareness, but there's no right answer.

          I prefer stable mature platforms for professional settings, because you can focus on the business problems and not the technology problems. It's the more fiscally responsible choice as well, given lower platform churn means less work that has no direct business value.

      • cylemons 2 days ago

        Just like simple webpages, simple desktop and mobile apps last a long time too. It is only when working on complex applications that use cutting-edge APIs that churn is a problem.

        • swiftcoder 2 days ago

          Both Apple and Google start stripping apps out of the store once they stop shipping updates. And Apple in particular deprecates APIs like they are going out of business.

          Not to mention things like the whole 32->64-bit transition that dropped all previous iOS apps (and on the macOS side of things, we've had four of those in the past 25 years: classic->OSX, PowerPC->Intel, 32->64-bit, Intel->M series).

          • cosmic_cheese 2 days ago

            Google is really the more strict of the two here. On the App Store you can get away with shipping an update from an old version of Xcode once every couple of years, while Google will boot you if you don’t increase your SDK target (which entails updating your whole toolchain and probably several dependencies too) once every year. Deprecations are pretty soft and many of the APIs deprecated well over a decade ago still technically function.

            The major platform transitions are harder breaks, but they're pretty rare. We're not going to see another architecture shift or bit increase for a long time.

        • kqr 2 days ago

          > mobile apps last a long time too

          Really? How come half of the apps I've used in the past are listed in the app store as "not compatible with this device, but they can be installed on $device_gone_to_rest_in_drawer_14_years_ago"?

          I'm genuinely curious! I thought it was aggressive deprecation of mobile OS APIs that made old apps no longer work on new phones.

          • cylemons 5 hours ago

            Have you tried installing APKs directly?

      • swiftcoder 2 days ago

        Have we actually had any big web API deprecations?

        I mean, I guess the death of Flash / Java applets / ActiveX / etc. counts, but in the JavaScript world it doesn't feel like we've actually had that many breaking changes.

        • pcthrowaway 2 days ago

          Sure we have. Off the top of my head, and relevant to this article: the WebSQL API was deprecated after 2-3 years of support.

          • swiftcoder 2 days ago

            I guess so. Since WebSQL got sniped well before standardisation (and Mozilla never implemented it in the first place), I wasn’t really counting it

          • afiori 2 days ago

            More like it was moved to a library

    • ikurei 2 days ago

      Webapps are rewritten because a developer wanted to use the new shiny, or because someone was convinced that everything would be better with the newer framework everyone is using. It also often goes hand in hand with giving the app a more modern look and feel.

      But the point is not whether webapps are rewritten, but whether they have to be rewritten. I know some old enterprise webapps made with PHP about 10 years ago that are still working fine.

      You do have to worry about security issues, and the occasional deprecation of an API, but there is no reason why a web-based service should need to be rewritten just to keep working. Is that true for mobile and desktop apps?

      • bluGill 2 days ago

        If your webapp is simple, a rewrite is no big deal and often cheaper than updating the old one. As your project gets large, that is no longer true. I work with embedded systems; when everything was small (8 bits isn't enough for anything else - a new feature often meant removing something else) we often rewrote large parts to get one new feature. It was easy to estimate a new project, and we came in on time. As projects get bigger (32 and 64 bits are now available) we can't do that: we can't afford a billion-dollar project to rewrite everything every year.

      • pixl97 2 days ago

        >but there is no reason why a web-based service should need to be rewritten just to keep working

        I mean, most webapps of any size are built on underlying libraries, and sometimes those libraries disappear, requiring a significant amount of effort to port to a new library.

    • eru 2 days ago

      Matlab, at least the language, changed quite a bit while I was using it. E.g. they added the ability to pass functions around as first-class values.

      But that was more than a decade ago, I guess?

      • kqr 2 days ago

        Languages often add features in ways that do not break existing code. I think here we are discussing changes that are backwards-incompatible.

    • domga 2 days ago

      Not really, web pages (which do not depend on specific APIs) still work just fine, they just look antiquated.

      Desktop apps in theory can run too, but it depends on what they link against and whether the OS still provides it.

      • 1313ed01 2 days ago

        My current hobby project is for DOS. Runs everywhere, mostly thanks to DOSBox, and the API has not changed since 1994 and will never change. For something to run offline and to avoid being stuck forever maintaining code I think this is what I will stick to for most of my future projects as well.

        It's not like any modern OS, or popular libraries/frameworks could not provide an equally stable (subset of an) API for apps, but sadly they don't.

        • kqr 2 days ago

          I actually think this is rather sensible. If I didn't have to learn DOS from scratch[1] to do it, I would be tempted to use it as a platform for many of my side projects. As you say, it will forever be a desirable platform to maintain (currently in the shape of DOSBox) thanks to the vast quantities of software – especially games – written for it.

          [1]: I am young enough that when I last used DOS was before I had started programming, so I never learned it beyond command-line interaction.

  • cjfd 2 days ago

    I don't think anyone wants warts. It is a fact that every technology has them, though. And also that every attempt to get rid of warts seems to introduce new, perhaps yet-unknown warts. There is a kind of law of conservation of misery here. I think I have now said, somewhat more clearly, what the article is trying to say, and without a title that is actually false.

    • imiric 2 days ago

      I think you're conflating warts with bugs.

      The point of the article is that warts are subjective. What one person considers unwanted behavior, another might see as a feature. They are a testament to the software's flexibility, and to maintainers caring about backwards compatibility, which, in turn, is a sign that users can rely on it for a long time.

      Nobody wants bugs. But I agree with the article's premise.

    • eucyclos 2 days ago

      This makes me think of Alexander's pattern language - according to that framework, one of the key components of beauty is texture. I wonder if our dislike of untextured experiences stems from an unease at not knowing where the thing is hiding its warts.

      • cjfd 2 days ago

        That could be true. What it makes me think of is the danger of things that magically just work: they may also stop working for some reason, and then you have no idea how to fix them.

  • capestart 2 days ago

    It’s easy to forget that warts are often just features that prevent breaking things down the line. SQLite may not be perfect, but it’s been around forever for a reason. Definitely prefer a solid, slightly imperfect tech stack over something that could change with every update.

  • imiric 2 days ago

    I agree with the article's premise, but would add one thing: warts might be a sign that the software is too flexible, and/or is doing too many things. For programming languages and databases this, of course, doesn't apply, since they're designed to be flexible. But simple tools that follow the Unix philosophy rarely have warts. I've never experienced warts with Unix tools such as `cat`, `cut`, `uniq`, etc., which I believe is primarily because they do a single thing, and their interface is naturally limited to that task. Maybe there were some growing pains when these tools were new, but now their interfaces are practically cemented, and they're the most reliable tools in my toolbelt. More advanced tools such as `grep`, `sed`, and particularly `awk`, are a bit different, since their use cases are much broader, they support regex, and `awk` is a programming language.
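
    For instance, the kind of single-purpose composition I mean (the data.csv here is hypothetical):

      # count occurrences of each distinct value in the 2nd CSV column;
      # uniq only collapses adjacent duplicates, hence the sort first
      cut -d, -f2 data.csv | sort | uniq -c | sort -rn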

    This is why my first instinct when automating a task is to write a shell script. Yes, shells like Bash are chock-full of warts, but if you familiarize yourself with them and are able to navigate around them, a shell script can be the most reliable long-term solution. Even cross-platform if you stick to POSIX. I wouldn't rely on it for anything sophisticated, of course, but I have shell scripts written years ago that work just as well today, and I reckon will work for many years to come. I can't say that about most programming languages.
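
    A minimal sketch of the defensive style I mean (POSIX sh; the script name and paths are made up):

      #!/bin/sh
      # fail fast on errors and on unset variables
      set -eu

      # ${1:?...} aborts with a usage message if no argument is given
      src="${1:?usage: backup.sh <dir>}"
      stamp=$(date +%Y%m%d)

      # quote every expansion so spaces in paths don't word-split
      tar -cf "backup-$stamp.tar" "$src"
      gzip "backup-$stamp.tar"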

    • jodrellblank 2 days ago

      > "I've never experienced warts with Unix tools such as `cat`, `cut`, `uniq`, etc."

      Literally the first time you use uniq and find duplicates in the output, that's a wart. You have to realise it's not doing "unique values", it's doing "drop consecutive duplicates" - and the reason is that 1970s computers didn't have enough memory to dedupe a whole dataset and could only do it streaming.
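
      A quick demo of the surprise:

        $ printf 'a\nb\na\n' | uniq
        a
        b
        a
        $ printf 'a\nb\na\n' | sort | uniq
        a
        b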

      cat being "catenate", intended to catenate multiple files together into one pipeline stream, but always used to read a single file instead of just giving the filename to a pager: "cat words.txt | less" is a wart seen all over the place.

      cut being needed to separate fields, because we couldn't all come together and agree to use the ASCII Field Separator character to separate fields, is arguably warty.

      Or the find command, where you can find files by different sizes: you can specify +100G for files with sizes 100 Gibibytes or larger, or M for Mebibytes, K for Kibibytes, and b for bytes. No, just kidding: b is 512-byte blocks. For bytes it's nothing at all, just +100. No, just kidding again: that's also 512-byte blocks. It's c for bytes. Because the first version of Unix happened to use 512-byte blocks on its filesystem[1]. And the c stands for characters, so you have to know that it's all pre-Unicode and using 8-bit characters.
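
      Concretely (the M suffix is a GNU/BSD extension):

        # three ways to say "bigger than 100 MiB", only one of them guessable
        find . -size +204800         # bare number: 512-byte blocks
        find . -size +204800b        # 'b': also 512-byte blocks
        find . -size +104857600c     # 'c': bytes ("characters")
        find . -size +100M           # extension shorthand for mebibytes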

      [1] https://unix.stackexchange.com/questions/259208/purpose-of-f...

      • imiric 2 days ago

        I think you're misunderstanding what a "wart" is. None of those examples are warts. They're the way those programs work, which you can easily understand by reading their documentation. Your point about `cat` is an example of misuse, and `cut` is needed because of the deliberate design decision to make text streams the universal interface, which has far greater benefits than forcing any specific separator.

        A wart is not surprising or unintuitive behavior. It is a design quirk—a feature that sticks out from the program's overall design, usually meant to handle edge cases, or to preserve backwards compatibility. For example, some GNU programs like `grep` and `find` support a `POSIXLY_CORRECT` env var, which changes their interface and behavior.

        My point is that the complexity of the software and its interface is directly proportional to the likelihood of it developing "warts". `find` is a much more complex program than `cut` or `uniq`, therefore it has developed warts.

        Next time, please address the point of the comment you're replying to, instead of forcing some counterargument that happens to be wrong.

        • jodrellblank 2 days ago

          The context of the blog post is that "warts" are the way something was originally designed or built, which is now outdated and you would change it if you could, but you don't want to break backwards compatibility so you don't.

          > "For example, some GNU programs like `grep` and `find` support a `POSIXLY_CORRECT` env var, which changes their interface and behavior."

          By your own logic that's not a wart: that's the way the programs work, and they are documented to work like that. Twisting yourself into knots to pretend Linux/Unix utilities are perfectly designed isn't good logic. Neither "it was designed that way" nor "it's documented to work like this" makes something not warty. Something might have been a good design originally and become a bad design as the world changed around it.

          For example, unserialised byte streams where you have to fight over whether \n or 0 (the null byte) is data or a record separator are a wart. How do we know? Because when people design newer tools they don't keep that, they design it away: serialised JSON and jq, .NET objects in PowerShell, structured data in nushell[1] and oilshell's YSH[2].
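
          A tiny illustration, assuming jq is installed: whitespace in the data is just data, not a separator to fight with.

            echo '{"file":"my report.txt","bytes":12}' | jq -r '.file'
            # -> my report.txt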

          [1] https://www.nushell.sh/

          [2] https://oils.pub/ysh.html

          • imiric 2 days ago

            Ugh, you're exhausting.

            Look, here's a program:

              print "Hello"
            
            Do you see any warts in it? No? Well, my point is that the likelihood of a program developing warts is partly related to its complexity. That's it.

            I mentioned Unix tools as an example of this, since they're by far the most reliable tools I use, particularly the simpler ones. I never said that they were "perfectly designed". Don't put words in my mouth.

            > By your own logic that's not a wart, that's the way the programs work and they are documented to work like that.

            A wart can be documented. I mentioned the case of `POSIXLY_CORRECT` since GNU tools predate the POSIX standard by a few years, so presumably that env var and the change in behavior were introduced when POSIX was first established. I'm not familiar with the detailed history, but that example is beside my point.

            > Something might have been a good design originally and become a bad design as the world changed around it.

            Again, you're misunderstanding what a wart is. It has nothing to do with bad design, and it doesn't appear on its own over time. It is a deliberate quirk that deviates from the program's original design—whatever that may be—in order to support some functionality without breaking backwards compatibility.

            Just because you don't like the design doesn't mean that the software has warts.

            Read that again.

            Those fancy shells you mention have at least one major drawback: interop between tools only works if structured I/O is supported by each tool, whether that is because they were written from scratch, or because it was tacked on via shims. In practice, this can only be done if the entire ecosystem is maintained by a single organization. And if that strict interface needs to be changed, good luck updating every single tool while maintaining backwards compatibility. Talk about warts...

            In contrast, using unstructured text streams as the universal interface means that programs written 40 years ago can interoperate with programs written today, without either author having to add explicit support for it, and without requiring centralized maintenance. Most well-designed programs do support flags like `-0` to indicate a null separator, which has become somewhat of a standard, but otherwise the user is in control of how that interop happens. This freedom is a good thing, and is precisely what makes programs infinitely composable.
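
            For example, the de facto NUL convention in action (find's -print0 and xargs's -0 are extensions, but near-universal):

              # newline- and space-proof interop between two old tools
              find . -name '*.log' -print0 | xargs -0 rm --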

            In any case, this discussion has veered off from my original point and the article, because you don't like how Unix programs are designed, and misunderstand what warts are. So for that reason, I'm out.

            • jodrellblank 18 hours ago

              > "Look, here's a program: print "Hello" Do you see any warts in it? No?"

              Yes. It's Python 2's print statement, from the Python wiki's PythonWarts page[1]. And it counters a lot of your claims:

              - It was right there from the beginning, it wasn't designed in later.

              - It may have been good design at the time when Python was a teaching language, and did appear on its own over time as the world and Python's use cases changed around it.

              - It wasn't added as a workaround to keep backward compatibility.

              - It is a single line of code, as small as you could reasonably get, and still warty.

              > "A wart can be documented. [A wart] is a deliberate quirk that deviates from the program's original design"

              It was your point that "x is not a wart because it's documented to work that way". How does the article's "Haskell .. built in String type is bad data structure for storing text" 'deliberately deviate' from Haskell's original design? How does SQLite's "tables being flexibly typed by default" deliberately deviate from SQLite's original design? On that PythonWarts page, "Explicit self in methods" is Python's original design. It's you who has come up with some personal definition of wart that nobody else is using.

              > "I mentioned Unix tools as an example of this, since they're by far the most reliable tools I use"

              That's orthogonal to their wartiness. And incidentally, that should make you suspicious. By the blog post's reasoning, tools are far more likely to be (warty + reliable) than (wart-free + reliable), since the latter requires them to have been designed perfectly and never changed, in an environment which isn't changing, or to have a design that lets them be changed very flexibly. All of those are unlikely and difficult, whereas the former (warty + reliable) can apply to anything, and probably does.

              > "Just because you don't like the design doesn't mean that the software has warts."

              And just because you like the design doesn't mean it's wartless.

              > "In contrast, using unstructured text streams as the universal interface means that programs written 40 years ago can interoperate with programs written today, without either author having to add explicit support for it"

              Unix/Linux shell streams are not unstructured text streams. That's why tools like `cut` exist: the text has structure. And null bytes can be used as record separators because the stream is not text. And it's why "interop between tools only works if structured I/O is supported by each tool" applies - each tool can only read the byte streams that fall within its ad-hoc, informally-specified half of a common serialization standard. It's not that I don't like this; it's that I don't like people spreading Linux propaganda about how this is wartless and brilliant when it's barely good enough even for tasks of the 1970s and should be in a museum.

              [1] https://wiki.python.org/moin/PythonWarts

  • yxhuvud 2 days ago

    Keeping warts around is also how you end up with systems and frameworks no one wants to touch with a ten-foot pole.

  • mapontosevenths 2 days ago

    The entire premise, that our goal is to create the software equivalent of 100+ year old bridges, is a bit flawed. We aren't building a historical legacy here. Our crummy web-apps are not the great pyramids, and should not be built like them. Nobody likes to admit it, but 99% of what we build today is disposable, and it should be built cheaply and quickly.

    I see this attitude, especially with juniors, and often with project managers, that we need perfection and are building things that are meant to last for decades. Almost nothing does though. Many/most business applications are obsolete within 5 years and it costs more to cling to the fiction that what we're doing is important and lasting.

  • grebc 2 days ago

    I think when you’re young & doe-eyed, what is shiny & new is exciting, and your wants are easily confused with needs.

    You don’t need the shiniest & newest framework to tell a computer to generate some HTML & CSS with a database and some logic. And you don’t have the lived experience of building & shipping to realise that only 1 in 1,000,000 projects probably ever gets more than 100 sets of eyeballs, so you end up using much more complicated tools than necessary.

    But all the news & socials will sing the praises of the latest shiny tool’s x.x.9 release, so those wants easily get confused with needs.

    • ehnto 2 days ago

      I more or less agree: if you are a small company with a small real-world userbase, you don't need hyperscaling tech, and thus don't have hyperscaling problems.

      I think some companies oversubscribe to reliability technology too. You should assess if you really need 99.9999% uptime before building out a complex cloud infrastructure setup. It's very likely you can get away with one or two VMs.

      As I understand it, HN runs on a single server with a backup server for failover.

    • pixl97 2 days ago

      You may not need the shiniest new framework, but you do have to ensure your old framework hasn't been abandoned or stopped getting security updates.

  • adityaathalye 2 days ago

    So much this. Warts = (software longevity) life lessons. (Though it must be noted that warts != bad design.)

    Some years ago, I gave a talk on functional-style shell programming which began with this:

      Prelude: Why even bother?
    
      Personally…
    
        Text is everywhere, better learn to process it
        The masters already solved it better
        Stable tools, well-known warts
        Personal/team productivity, tool-building autonomy
        To learn about
            good and successful design
            bad and successful design
            computer history
        For the fun of it
    
    
    ref:

    Dr. Strangepipes Or: How I Learned To Stop Worrying && Function In Shell (Functional Conf 2019)

    org-mode slideware: https://gist.github.com/adityaathalye/93aba2352a5e24d31ecbca...

    live demo: https://www.youtube.com/watch?v=kQNATXxWXsA&list=PLG4-zNACPC...

  • socalgal2 2 days ago

    Long-lasting warts might be evidence that the software is backwards compatible. They might also be evidence that the software is full of footguns and ways to screw yourself.

    I'm sure it's an unpopular opinion, but sh/bash scripts suck. There are magic incantations all over, and if you get one wrong you've got code injection. We can't go back and fix it, but we could either replace it, or update it in some way so that it's easy to be safe, with only one way to do things that "does the right and safe thing" always.
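
    The classic footgun, with a made-up filename:

      f='my file; rm -rf *'
      ls $f     # unquoted: splits into five words, and the * is then
                # glob-expanded against the current directory
      ls "$f"   # one argument, as intended - safety is opt-in, not default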

    I don't think unsafe-by-default is a good model, and I think the daily headlines of people/companies/hospitals/airports/governments being hacked exist in large part because we keep the warts.

    • drdo 2 days ago

      I don't think it's unpopular at all that bash scripts suck a lot.

    • grebc 2 days ago

      I don’t know how you take the article OP wrote/posted and conflate it with security breaches.

      Long lived shipping code is typically not aesthetically pleasing.

      • aloha2436 2 days ago

        Some of these warts are problems with security, or data integrity, or resource usage, etc.

        It's a difference of opinion more than it is a conflation with something else.

        My _personal_ preference is that software does the "correct" thing by default even if it breaks my build or my tests or even running software; I would rather it break visibly than work nefariously.

  • m4rc3lv 2 days ago

    I think I would use PHP for the backend instead of Node or Perl.

  • snowwrestler 2 days ago

    Is there some version of Betteridge’s Law of Headline Questions but for the imperative voice?

    But seriously I don’t think warts are the key to long-serving web applications without maintenance. The key is to pick technologies with no security vulnerabilities. That way you won’t ever have to install patches (which risk breaking functionality, which requires maintenance to fix), and you won’t ever get hacked (which also requires maintenance to fix).

    Do such technologies exist? I don’t think so. Which is why I think the concept of constant maintenance is inherent to web applications and everyone should just accept that.

    I think we’re far enough along in this journey to recognize that the dream of “immortal frictionless machinery” is not what software actually is. We create the friction (via usage, attacks, and patches) and we should plan to address it.

    Bridges do receive regular maintenance BTW, even very well-engineered ones…

    • 1313ed01 2 days ago

      No, ideally you pick technologies that still have maintenance and security updates, but no other changes. That assumes the technology already does what you need it for, and that is usually the case. Without other changes there will also be no new security issues to fix.

      • snowwrestler 2 days ago

        It’s just impossible to exclude the possibility of breaking changes because it’s impossible to exclude unforeseen bugs.

        Operationally it is much better to accept the need for software maintenance and plan for it.

      • pixl97 2 days ago

        >, ideally you pick technologies that still have maintenance and security updates

        These are exceedingly rare, and even more rarely do those stable technologies cover your actual needs.

  • keithnz 2 days ago

    Thing is, with AI, this is much less of an issue: as long as AI can maintain the code, it can transition it through tech changes. That seems a better bet than trying to choose things that will remain static.

    • ehnto 2 days ago

      How is that any different from having a developer work on it, which also doesn't happen? It's not a money thing; huge companies with dev teams fail at this too.

      I don't disagree that things probably won't stay static though, but you can do a lot to minimise the surface area of change you are exposed to.