The Subjective Charms of Objective-C

(wired.com)

10 points | by wmlive a day ago

4 comments

  • markus_zhang a day ago

    Never used Objective-C or done any serious programming (I work as a DE, which is mostly data modelling, so I don't consider myself a serious programmer), but I feel the same as the author.

    The work has bogged down whatever interest I have in programming, and the only sane solution is to somehow magically remove all financial burdens, retreat to a cabin in the mountains, and program my own projects and read some science, preferably with a dog and a fire.

  • vaxman a day ago

    I had read a book about Objective-C in the 1980s and, toward the end of the decade, was in with the Sybase crowd, so I did see NeXTStep (which had adopted the language) back in its prime and instantly realized the potential -- NeXTStep let "business analysts" visually create graphical database apps using pre-manufactured components written in Objective-C. (More than this, really: there is also the concept of Responders, which are like in-line services that can be arranged to create reusable value -- see the sketch below.)
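
    To make that Responder idea concrete: the pattern survives today as AppKit/UIKit's responder chain, where an event walks up a chain of objects until one of them handles it. A minimal sketch -- LoggingResponder is a made-up name, but NSResponder, keyDown:, and the forward-to-next-responder default are the real AppKit machinery:

        #import <AppKit/AppKit.h>

        // Hypothetical responder that logs key presses, then lets the
        // event continue up the chain (NSResponder's default keyDown:
        // forwards unhandled events to the next responder).
        @interface LoggingResponder : NSResponder
        @end

        @implementation LoggingResponder
        - (void)keyDown:(NSEvent *)event {
            NSLog(@"saw key: %@", event.characters);
            [super keyDown:event]; // pass it along the chain
        }
        @end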

    The problem Steve had with that nice Objective-C system was that the fools who ran Corporate America were from the generation still "shell shocked" by all the vendor lock-in that went on during the wars between DEC, IBM, HP, Sperry and Burroughs (later Unisys), and smaller players like Nixdorf -- none of which were software (or hardware) compatible with each other, meaning their customers were held hostage, often with incredibly expensive long-term contracts on less-than-state-of-the-art machines.

    In the new desktop era (which had just begun), for a short while at least, they wanted "cross-platform apps" that ran on universal hardware (think PC and Mac "clones"), and that meant using object-oriented frameworks. The only OOP systems mature enough to have platform-specific GUI libraries for DOS, Windows, and Mac were some interesting Smalltalk packages, some exotic C-macro-based systems like Neuron Data's Open Interface Toolkit, and eventually Microsoft MFC (which was available for Classic Mac and all versions of Microsoft Windows). Of course, as Windows took over the game, the need for cross-platform apps ended -- just as Visual Studio, Microsoft FoxPro/Access, and Visual Basic were cleaning up and really locking everyone in for the decade. There was no more need for object-oriented systems like NeXTStep or Smalltalk.

    But then the WWW became a thing, and NeXT made a bold move with WebObjects, which allowed their Objective-C visual tools (Project Builder) to output HTML in real time. Around the same time, Sun Microsystems launched Java with a really terrible UX library but the promise of a portable Java interpreter (similar to the promise of UCSD p-code back in Steve's earlier Apple days), which meant Java could "run (ugly-looking but portable) code" in web browsers on any hardware. Oh happy days: a way to get back into Corporate America without the word Microsoft on your business cards. While the FIRST end of Objective-C is described in the article (before NeXT picked it up), Steve saw the potential to replace that ugly Java UX library with their WebObjects masterpiece and pivoted to rewrite WebObjects to output Java -- and that was the SECOND end of Objective-C.

    Somehow my Dad's old colleague, working with Rear Admiral Grace Hopper on behalf of CalPERS, was able to bail the shareholders of NeXT and Apple out. But NeXT was all about WebObjects by then, and you couldn't run Macs with WebObjects (that would be a "thin client," a much-maligned concept that offered no value-add for Apple), so Avie Tevanian's team had to hold their nose and fuse the stinking classic Mac operating system into NeXTStep, breathing new life into Objective-C.

    As that post-NeXT Mac operating system was "forked" to make battery-powered phones, the idea of allowing developers to write apps in Objective-C became a liability. Suddenly there was a class of "fart apps" draining batteries, closing unexpectedly, heating up devices in people's hands, etc. -- and to a casual user with limited computer experience, that looked like an iPhone problem. Apple had the incredible app-review process going, but that's not the correct prescription for curing stupid programming. They needed a solution like Java (being used by their competitor and embroiled in litigation involving Sun/Oracle/Microsoft/Google) or even their old UCSD p-code interpreter that ran Pascal, a language long since obsoleted by Ada, which everyone with half a brain hates. So...

    > The end came for Objective-C in June of 2014, when Apple announced the debut of Swift

    I wouldn't say that the THIRD death of Objective-C happened in 2014, nor is it upon us in 2025. Aside from a lot of existing code and performance reasons to use Objective-C, there is also the fact that the popular AI coder models were all trained on GitHub as of roughly 2023, and the Swift code out there spans six different versions of the language, mostly written by people who were just learning it (so it doesn't leverage Swift's value prop much). Cross-platform Linux/Mac code, like Moonlight for iOS, is also written in Objective-C (which had superior C++ integration compared to Swift until very recently -- see the sketch below). It's possible to remove Objective-C from Apple's product line, but it can't be a priority given the industry move to agentic apps.
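
    To make that C++-integration point concrete: Objective-C++ (.mm files) lets a C++ object live directly inside an Objective-C class with no bridging layer, something Swift only approached with the C++ interop introduced around Swift 5.9. A minimal sketch -- Tally and its methods are made-up names; std::vector, NSObject, and the .mm convention are real:

        // tally.mm -- the .mm extension makes clang compile this as Objective-C++
        #import <Foundation/Foundation.h>
        #include <vector>

        // Hypothetical class: an Objective-C object holding a C++
        // container directly as an instance variable.
        @interface Tally : NSObject {
            std::vector<int> _values; // C++ member in an Obj-C ivar block
        }
        - (void)add:(int)v;
        - (NSUInteger)count;
        @end

        @implementation Tally
        - (void)add:(int)v { _values.push_back(v); }
        - (NSUInteger)count { return _values.size(); }
        @end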

  • mistrial9 a day ago

    > Objective-C came up in the frenzied early days of the object-oriented programming era, and by all accounts, it should have never survived past it.

    This is deeply uninformed, with bald prejudice added.

    • bediger4000 a day ago

      I agree. I had a couple of NeXT slabs from 1991 to 2000 or so. Objective-C had some benefits. The NeXT Obj-C libraries were very usable and well thought out. I will grant that it had problems, the biggest of which was that it wasn't Windows 3.11 and wasn't backed by Microsoft. The amount of pro-Microsoft press and propaganda during that period was astonishing.