Similarly, Margo Selby crafted a very large, vibrant 16m textile installation titled ‘moon landing’ based on the work of Navajo women who wove the integrated computer circuits and memory cores that enabled the 1969 moon landing. Until recently it was on display at Canterbury Cathedral. It is accompanied by a musical composition for strings by Helen Caddick.
https://www.margoselby.com/pages/moon-landing
What a fascinating glimpse into a part of history that I had no clue about. The main reason the Navajo (and other Native American nations) figured in my 'history of the world' so far was the WWII-era communications saga.
https://www.nationalww2museum.org/war/articles/american-indi...
Back in the second half of the 1980s there was a brief fashion trend of woollen knit sweaters with IC-mask-type patterns. Guessing it was related to designers playing around with design software and knitting tech made possible by the microprocessor revolution.
We’ve come full circle - knitting tech was the basis for early computing machines!
Jacquard loom!
Yes, and early core memory was also woven by hand. I am not sure if this was just for core rope memory, or if it was more widespread than that.
Early core memories were woven by hand, but IBM rapidly automated the process. (Since most computers from the 1950s to early 1970s used core memory, there was a lot of demand.) However, IBM later found that it was cheaper to have the memories assembled by hand in Asia. For detailed information on core memory, see the book "Memories That Shaped an Industry".
Core rope is different from core memory and much rarer. Core rope is essentially ROM, using much larger cores with wires going through or around a core, storing 192 bits per core. Core ropes were hand-woven (with machine guidance) for the Apollo Guidance Computer.
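To make the through-versus-around idea concrete, here's a minimal Python sketch of a core-rope read. It assumes each core holds 12 words of 16 bits (12 × 16 = 192 bits), with a sense wire threaded through the core reading 1 and a wire bypassing it reading 0; the class and addressing scheme are illustrative, not the actual AGC wiring:

```python
class CoreRopeROM:
    """Toy model of core-rope ROM: the data is fixed by how the sense wires are woven."""
    WORDS_PER_CORE = 12   # 12 words x 16 bits = 192 bits stored per core
    BITS_PER_WORD = 16

    def __init__(self, woven_words):
        # woven_words[c][w] is the 16-bit word held by word-slot w of core c;
        # bit = 1 means that sense wire was threaded *through* the core,
        # bit = 0 means the wire goes around (bypasses) the core.
        self.cores = woven_words

    def read(self, address):
        core = self.cores[address // self.WORDS_PER_CORE]   # pulse this core
        word = core[address % self.WORDS_PER_CORE]          # pick the sense-wire set
        return word & ((1 << self.BITS_PER_WORD) - 1)

# Example: one core "woven" with 12 arbitrary words
rom = CoreRopeROM([[0o030000 + i for i in range(12)]])
print(oct(rom.read(5)))   # -> 0o30005
```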
In December/January 1987 I was doing a vacation EE internship at a power station in Australia. Some of the Hitachi minicomputers still used core RAM. This was an all-Hitachi Heavy Industries turnkey coal-fired power station commissioned ca. 1985. Pretty sure they had a reference design, from boilers and turbines right down to the hardware and software level, and cookie-cutter stamped out power stations from it. The Hitachi engineering attitude was obviously "If it works, keep doing it the same way for as long as possible". I was told that for some software (firmware?) updates, they'd simply ship out a new core RAM module -- it's non-volatile after all.
This is how we pass our chip designs to our descendants so they may rebuild civilization.
Reminds me of "A Canticle for Leibowitz".
https://en.wikipedia.org/wiki/A_Canticle_for_Leibowitz
Needs to be stone monoliths!
A very inaccurately timed civilization.
That's on you for not consulting the tolerance band on your resistors.
I'm reminded of this: https://www.electronicdesign.com/technologies/analog/article...
Imagining an ancient theological order with ranks based on color coding...
That could easily pervert into racial castes, but I get the cargo-cult vibe of trying to induce the "benign ones" to come back.
Author here if anyone has questions...
Yes, my question is: did the weaver have any questions?
Marilou Schultz asked me to suggest some chips that would make good weavings, and I suggested the 555, among other chips. She also had questions about the different colors and textures in the chip. She notices a lot more about the colors than I do; I look at a chip in terms of functionality and connectivity and don't pay attention to the colors.
Speaking of colors, you mentioned the significance of the purple/lavender in the weaving. But I don't see any in the pictures! What am I missing?
I haven't seen the weaving in person; I think the colors don't come through clearly in the photo. From talking with the weaver, the lavender is apparently in the metal regions.
Cost for a piece like this? It's striking!
I don't know the cost of her weavings. They are very time-consuming to create, so I hope she charges a good price.
This piece would be very enticing for a tech billionaire.
Any question I have starts with "tell me a lot about the Navajo people"... so no questions here. Just want to say: good article.
I went into a lot more of the Navajo history in my previous article [1] so I didn't repeat it in the new article. The quick summary is that the Navajo suffered a century of oppression, were forced off their land in the Long Walk, and had their sheep slaughtered in the 1930s in the Navajo Livestock Reduction. In the 1960s, the Navajo had 65% unemployment, $300 per capita income, and lacked basic infrastructure. Various groups looked to industrialization as a solution, so Fairchild opened an IC manufacturing facility on Navajo land in 1965, employing 1200 Navajo workers and becoming the nation's largest non-government employer of American Indians. The plant was generally considered a success, but in 1975, Fairchild had business problems and laid off 140 Navajo employees. Things went downhill and a radical group, AIM (American Indian Movement), took over the plant with rifles. The armed occupation ended peacefully after a week, but Fairchild closed the plant and moved production to Asia.
[1] https://www.righto.com/2024/08/pentium-navajo-fairchild-ship...
Just to be clear since "oppression" is a very broad term: the Navajo (and most other Native American tribes) are victims of genocide. It was a far, far, far more systematic destruction effort than mere marginalization.
Children were stolen, forbidden from learning their native language, killed en masse, food supplies were destroyed, land was continuously taken from them the second anything valuable was discovered on it, etc. etc.
It's really horrific stuff and the effects are still extremely clear on the reservations today.
Oh wow, thanks for the info!
Between the Navajo and the northernmost South American migrations [i.e. the Aztec], it appears North America was colonized thousands of years before anybody else even knew about it, let alone cared for it.
Fascinating related article about a weaving of a Pentium (linked near the bottom): https://www.righto.com/2024/08/pentium-navajo-fairchild-ship..., discussion a year ago: https://news.ycombinator.com/item?id=41418301 (84 comments)
Delightful crossover: silicon layout turned into textile logic. The 555 is perfect for this—bold pinout, big blocks (comparators + RS latch), and routing that reads from a distance. Add a tiny legend and it’s a great teaching piece.
I saw her work at MoMA, loved it. She's 70? That's even more awesome.
That's a pretty darn cool looking thing.
Funny how, guided by pure mechanical necessity, pretty stuff can arise.
I've always thought that clockwork, chips and other machines were pretty.
And fractals. ( https://fleen.org/i40.png ) And plants and animals too. And weathered rock.
Which leads me to consider what isn't pretty. Naivety?
This is beautiful. Thank you, Ken, and thank you, Marilou, for sharing :)
Reminded me of this “Navajo weaver inspired by video games”: https://www.youtube.com/watch?v=TrDZIyYSMfI
The 555 timer is iconic. Just iconic. I wonder how many billions of them have been shipped over the years?
It's fortunate that the history of the 555 timer is so well documented. Its inventor, Hans Camenzind, wrote several books and even had a YouTube channel in his later years [1]. It's a shame that so many iconic chips that have changed the world aren't as well documented. I went down a real rabbit hole a while ago trying to find in-depth information about the Hitachi HD44780. I couldn't even decisively pin down exactly what year it was first manufactured. It's interesting to think of microchip designs as a kind of artistic legacy: chips like the 555 have had an enormous impact on modern history.
1: https://www.youtube.com/@hcamen
I couldn't even decisively pin down exactly what year it was first manufactured
Likely 1985.
https://www.crystalfontz.com/blog/look-back-tech-history-hd4...
I did see this article when I was researching, but it's incorrect. You can find references to the HD44780 in earlier catalogs. The earliest reference I can find is 1981: https://archive.org/details/Hitachi-DotMarixLiquidCrystalDis...
It's also referenced in this catalog from 1982: https://bitsavers.org/components/hitachi/_dataBooks/1982_Hit...
Likely the first year of manufacture was 1981/82.
That 1981 document says "preliminary", which suggests very limited trial production. Even the 1985 reference I found is "advance". First tape-out may have been earlier than that. Have you tried asking Hitachi (apparently now Renesas) about it? It's more likely that this information is available on the Japanese part of the internet.
Try 1972; taped out in 1971.
https://en.wikipedia.org/wiki/555_timer_IC
Saw an exhibit with some of her work, I think in Albuquerque. Was surprised/delighted to see weavings of circuits.
This is so cool. So if they used twists of steel wires or similar as string for the white parts, they could have a functional circuit.
They'd still need the electrical components, such as the transistors and passive components.
Very cool art piece!
Alan Dean Foster's "Cyber Way" is a somewhat thematic SF novel.
The continued popularity of this chip confuses me. I don't understand why it didn't get forgotten decades ago as microcontrollers became commonplace. Though compared to the Pentium, a weaving of these older designs is likely faster to make, so I wonder if the weaver markets to an older audience who is nostalgic for these ancient chips.
You may be right about nostalgic reasons, but as a freshman during the emergence of microcontrollers, I asked an old professor the same question, in the sense of "why is discrete digital electronics still widely used?".
His response still resonates with me today: a military-grade 555 would work in extreme conditions (e.g. heat), would last pretty much forever, would consume virtually no power, and would still cost you a penny.
Sometimes that's exactly what you need. Reliability, durability, and cost trump the power of programmability.
The original bipolar variant of NE555 is likely to have a lifetime of many decades, if not more than a hundred years, even when operated continuously in harsh environments.
A modern CMOS microcontroller has a much more limited lifetime. Depending on the model, you can hope for 10 or 20 years, but not much more than that, because very small MOS transistors and flash memory cells eventually die, unlike the more robust bipolar ICs, whose active regions are buried in the semiconductor crystal rather than located at its surface as in MOS devices.
The world would be a much sadder, drearier place without the 555. That's the nostalgia part out of the way.
Really, it's such a useful, almost universal Lego block of a component that it's hard to imagine it going away anytime soon. Sure, microcontrollers are as cheap as chips these days, but you get a lot more with them. Do I need to say that sometimes more is less? I can think of scenarios where you absolutely don't want a chip containing firmware/code that needs auditing and locking down.
Well, even if we assume there's a suitable 8-pin microcontroller that doesn't cost more than the 555, merely loading the firmware onto the microcontroller is going to add significant cost and complexity at the manufacturing stage. Also, the microcontroller would be far more sensitive to power-supply inadequacies, because its state consists of much more than a capacitor and a flip-flop.
Because it's fun and there are many readily available DIY designs that use it.
Back in the day you'd go into an electronics store and there'd be books containing just 555 circuit recipes. Not to mention the magazine articles.
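For flavor, the most common recipe from those books is the astable (free-running) oscillator, which boils down to two RC time constants. A quick sketch of the standard formulas, with purely illustrative component values:

```python
def ne555_astable(r1_ohms, r2_ohms, c_farads):
    """Standard NE555 astable timing: C charges through R1+R2, discharges through R2."""
    t_high = 0.693 * (r1_ohms + r2_ohms) * c_farads   # output high while charging
    t_low = 0.693 * r2_ohms * c_farads                # output low while discharging
    freq = 1.0 / (t_high + t_low)                     # ~ 1.44 / ((R1 + 2*R2) * C)
    duty = t_high / (t_high + t_low)
    return freq, duty

# Illustrative values: R1 = 1 kΩ, R2 = 10 kΩ, C = 100 nF
f, d = ne555_astable(1_000, 10_000, 100e-9)
print(f"{f:.0f} Hz, {d:.0%} duty cycle")   # ≈ 687 Hz, ~52% duty cycle
```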
And every EE student, back when we tied onions to our belts, must have had a lab assignment to spec out a PLL using a 555 and bits and bobs and then measure transient responses, temperature stability, etc.
Would be cool if the creator used semiconducting threads and it actually worked.
This is the kind of thing that can start conspiracy theories of time travelers :)
My immediate thought was "Nazca lines of a 555? Yes".
I have a Displate of a 555 in my little maker corner someone gifted me once: https://eikehein.com/assets/images/makercorner.jpg
I demand to know what that construction of aluminium extrusions is. They're my pet material.
A spider bot with the legs folded up :)