I still cannot believe NASA managed to re-establish a connection with Voyager 1.
That scene from The Martian where JPL had a hardware copy of Pathfinder on Earth? That’s not apocryphal. NASA keeps a lot of engineering models around for a variety of purposes including this sort of hardware troubleshooting.
It’s a practice they started after Voyager. They shot that patch off into space based on old documentation, blueprints, and internal memos.
Imagine scrolling back in the Slack chat 50 years to find that one thing someone said about how the chip bypass worked.
Imagine any internet company lasting 50 years.
This is why Slack is bullshit. And Discord. We should all go back to email. It can be stored and archived and organized and get off my lawn.
It’s not Slack’s fault. It is a good platform for one-off messages. Need a useless bureaucratic form signed? Slack. Need your boss to okay the afternoon off? Slack. Need to ask your lead programmer which data structure you should use and why they’re set up that way? Sounds like the answer should be put in a wiki page, not Slack.
All workflows are small components of a larger workplace. Emails also suck for a lot of things. They probably wouldn’t have worked in this case, memos are the logical upgrade from emails where you want to make sure everyone receives it and the topic is not up for further discussion.
memos are the logical upgrade from emails where you want to make sure everyone receives it
uh, email is memos? email is so memos that ibm’s proprietary email management solution Lotus Notes calls the transaction “create memo” where outlook calls it “new message”.
and the topic is not up for further discussion.
bit rude, imo.
Sorry, email is still better for all of those things. Except the wiki page, of course.
deleted by creator
I mean, unironically, yeah.
It’s not even that we need to go back to email. The problem isn’t moving on from outdated forms of communication, it’s that the technology being pushed as a replacement for it is throwing out the baby with the bathwater.
Which is to say nothing of the fact that all of these new platforms are proprietary, walled off, and in some cases don’t make controlling the data easy if you’re not hosting it (and their searches are trash).
all of these new platforms are proprietary, walled off, and in some cases don’t make controlling the data easy if you’re not hosting it
You’ve just discovered their business case. So many new businesses these days only insinuate themselves into an existing process in order to co-opt it and charge rents.
Even then, if you get banned from Google for some reason, what then?
- Don’t use google as your email provider
- Keep backups of your email (you can do this on gmail, too)
Microsoft is 49.
Yeah. Technically I’m not talking about Microsoft, as their primary product is the OS and they are not purely Internet-based. IBM, of course, is much older than that and also has some Internet products, as does every software company.
In my statement “Internet company” means a company whose only product is SaaS on the Internet; i.e. someone who, if they went away, their product would disappear with them.
I guess it is hard to imagine an internet company lasting that long, mostly because the internet hasn’t been around that long; it’s only been 31 years since it went public. A year later Amazon was formed. I would bet money Amazon and Google easily make it to 50, along with many, many others. A small, not overly commercialized company like Slack would be crazy. I wouldn’t be surprised if they get gobbled up by a megacorp as the enshittification continues.
Google is actually the sine qua non of what I’m talking about. I’ll concede that it’s possible Google as a corporate entity will still exist in 2048 (it was founded in 1998). But Google has undergone such a drastic and dystopian management change that it’s almost not even the same company now–
–but that isn’t relevant to what I’m actually talking about, which is the products. The proposition that Slack logs would still be around 50 years from now was what catalyzed my quip. Google kills everything it makes, usually quickly. Will we be able to look at Google Reader logs in 2048? Or–even closer to the target–Google Wave logs? Google Podcasts? Google Stadia? (I could go on.)
At the end of the day it was just a quip, but I fully expect the SaaS companies you currently think of as indestructible titans to be on the dustheap of history in 20 years, let alone 50.
I don’t think the actual logs on slack will go away. Just maybe hosted on a different server owned by a different corporation.
Match Group (owners of nearly every dating site and app) is very likely to endure 50 years, and they are, afaik, a 100% internet company; pull the plug and they disappear without a trace.
And most Microsoft products surely can run 50 years with no glitches.
fingers crossed!
They were a software company for decades before they became an “internet company”.
IBM is 100, but the Internet didn’t exist in 1924, so we’ll say the clock starts in 1989. I’m pretty sure at least MS or IBM will be around in 15 years.
What does IBM even do anymore? I’m guessing they just support all of their legacy products that customers are locked into.
Bought Red Hat and is trying to lock down its users.
It’s basically an investment fund that runs the companies it invests in, like Alphabet, but with a bigger mix of real estate and finance investments thrown in.
What’s a company they invest in?
Themselves. I’m just saying internally they run it like an investment fund. Example: https://medium.com/design-ibm/area-631-what-i-learned-in-ibms-start-up-and-innovation-program-d87ed98f9549
deleted by creator
To add to the metal, the blueprints include the schematics for the processor itself.
https://hackaday.com/2024/05/06/the-computers-of-voyager/
They don’t use a microprocessor like anything today would, but a pile of chips that provide things like logic gates and counters. A grown-up version of https://gigatron.io/
That means “written in assembly” means “written in a bespoke assembly dialect that we maybe didn’t document very well, or the hardware it ran on, which was also bespoke”.
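A toy sketch of that idea (nothing like Voyager’s actual schematics, just an illustration): when your “processor” is a pile of gate chips, even a counter is something you compose out of primitives, the way TTL-era boards composed 7400-series parts. Everything below is derived from a single NAND function.

```python
# Illustrative only: a 4-bit counter composed entirely from a NAND
# primitive, the way discrete-logic boards build everything from gates.

def NAND(a, b):
    return 0 if (a and b) else 1

def NOT(a):
    return NAND(a, a)

def AND(a, b):
    return NOT(NAND(a, b))

def XOR(a, b):
    # XOR from four NANDs, a classic gate-count trick
    n = NAND(a, b)
    return NAND(NAND(a, n), NAND(b, n))

def increment(bits):
    """One clock tick of the counter: ripple-carry add of 1.
    bits[0] is the least significant bit."""
    carry = 1
    out = []
    for b in bits:
        out.append(XOR(b, carry))
        carry = AND(b, carry)
    return out

state = [0, 0, 0, 0]
for tick in range(10):
    state = increment(state)

value = sum(bit << i for i, bit in enumerate(state))
print(value)  # after 10 ticks the counter reads 10 (binary 1010)
```

The point isn’t the Python, it’s that every behavior of the machine bottoms out in wiring like this, which is why the blueprints alone are enough to reason about what the hardware will do.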
They also released the source code of the Apollo 11 guidance computer. So if you want to fly to the moon, here is one part of how to do it.
Nice, now I just need a rocket and launchpad! Craigslist?
Commission one on fiverr or etsy.
I realize the Voyager project may not be super well funded today (how is it funded, just general NASA funds now?), just wondering what they have hardware-wise (or ever had). Certainly the Voyager system had to have precursors (versions)?
Or do they have a simulator of it today? We’re talking about early-’70s hardware; it should be fairly straightforward to replicate in software. Perhaps some independent geeks have done this for fun? (I’ve read of some old hardware such as the 8088 being replicated in software because some geeks just like doing things like that.)
I have no idea how NASA functions with old projects like this, and I’m surely not saying I have better ideas - they’ve probably thought of a million more ways to validate what they’re doing.
deleted by creator
takes long drag off cigarette “I’m too old for this shit”
You sure? The smell of some of the corpses would have been terrible.
I’m not saying they’re all dead, but an intern at the time of launch would now be 70. Anybody who actually designed anything is… Well… The odds of them still being around are low.
I have an uncle who worked on Apollo writing machine code, and he is a spry, clear-headed 80-something-year-old.
IIRC, they did pull in a guy who had just started his career on the project.
Son of a bitch, I’m in.
do I hear heist movie?
The Hard Fork podcast had a pretty good episode recently where they interviewed one of the engineers on the project. They’d troubleshot the spacecraft enough in the past that they weren’t starting from square one, but it still sounded pretty difficult.
They apparently didn’t have an emulator. The first thing I’d have done when working on a solution would have been to build one, but they seem to have pulled it off without.
There is a fascinating documentary about the team that sends the commands to Voyager 1 and 2 called It’s Quieter in the Twilight.
100% they’ve got an emulator. They’ve had dedicated test environments for emulating disaster-recovery scenarios since the moon landings; they’ve likely got at least one functioning hardware replica and very likely can spin up a hardware emulation as a virtual machine at will.
Source: I made this up, but I have a good understanding of systems admin and have an interest in space stuff, so I’m pretty confident they would have this stuff at bare minimum.
That’s my assumption too, but we’re talking about a different era, and I really have no idea how they approached validation and test/troubleshooting.
I’ve seen some test environments for manned missions, but that’s really for humans to validate what they’re doing.
V’ger was quick 'n dirty by comparison (with no criticism of the process or folks involved…they had one chance to get these missions out there).
To me, the physics of the situation makes this all the more impressive.
Voyager has a 23-watt radio. That’s about 10x as much power as a cell phone’s radio, but it’s still small. Voyager is so far away it takes 22.5 hours for the signal to get to Earth traveling at light speed. This is a radio beam, not a laser, but it’s an extraordinarily tight beam for a radio, with the focus only 0.5 degrees wide; even so, the beam is still 1000x wider than the Earth when it arrives. It’s being received by some of the biggest antennas ever made, but they’re still only 70 m wide, so each one only receives a tiny fraction of the power transmitted. So, they’re decoding a signal that’s 10^-18 watts.
So, not only are you debugging a system created half a century ago without being able to see or touch it, you’re doing it with a 2-day delay to see what your changes do, and using the most absurdly powerful radios just to send signals.
The computer side of things is also even more impressive than this makes it sound. A memory chip failed. On Earth, you’d probably try to figure that out by physically looking at the hardware, and then probing it with a multimeter or an oscilloscope or something. They couldn’t do that. They had to debug it by watching the program as it ran and as it tried to use this faulty memory chip and failed in interesting ways. They could interact with it, but only on a 2-day delay. They also had to know that any wrong move and the little control they had over it could fail and it would be fully dead.
So, a malfunctioning computer that you can only interact with at 40 bits per second, that takes 2 full days between every send and receive, that has flaky hardware and was designed more than 50 years ago.
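If you want to sanity-check that 10^-18 W figure, here’s a rough link-budget sketch. The 23 W transmitter, 22.5 light-hour distance, and 70 m dish are from the description above; the 8.4 GHz X-band downlink, Voyager’s 3.7 m high-gain dish, and the ~50% antenna efficiencies are approximate public figures I’m assuming, so treat the result as order-of-magnitude only.

```python
# Back-of-the-envelope link budget for the Voyager 1 downlink.
# Frequency, dish diameter, and efficiencies are assumed round numbers.
import math

c = 3.0e8                       # speed of light, m/s
f = 8.4e9                       # X-band downlink frequency, Hz (assumed)
wavelength = c / f

P_tx = 23.0                     # transmit power, W
d = 22.5 * 3600 * c             # 22.5 light-hours, in meters

# Gain of Voyager's 3.7 m high-gain dish (50% efficiency assumed)
D_tx = 3.7
G_tx = 0.5 * (math.pi * D_tx / wavelength) ** 2

# Power flux at Earth: transmit power concentrated by the dish gain,
# spread over a sphere of radius d
flux = P_tx * G_tx / (4 * math.pi * d ** 2)   # W/m^2

# Effective collecting area of a 70 m DSN dish (50% efficiency assumed)
A_rx = 0.5 * math.pi * (70 / 2) ** 2

P_rx = flux * A_rx
print(f"{P_rx:.1e} W")          # roughly 3e-19 W under these assumptions
```

With these numbers it comes out a few times 10^-19 W, the same ballpark as the 10^-18 W quoted above; real figures depend on the exact antenna efficiencies and pointing.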
And you explained all of that WITHOUT THE OBNOXIOUS GODDAMNS and FUCKIN SCIENCE AMIRITEs
Oh screw that, that’s an emotional post from somebody sharing their reaction, and I’m fucking STOKED to hear about it, can’t believe I missed the news!
Finally I can put my take into this. I’ve worked in memory testing for years, and I’ll tell you that it’s actually pretty expected for a memory cell to fail after some time. So much so that what we typically do is build redundancy into the memory cells: we add more memory cells than we might activate at any given time. When shit goes awry, we can reprogram the memory controller to remap the used memory cells so that the bad cells are mapped out and unused ones are mapped in. We don’t typically probe memory cells unless we’re doing some type of in-depth failure analysis. Usually we just run a series of algorithms that test each cell and identify which ones aren’t responding correctly, then map those out.
None of this is to diminish the engineering challenges that they faced, just to help give an appreciation for the technical mechanisms we’ve improved over the last few decades
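A hypothetical sketch of that test-and-remap flow, heavily simplified (real memory repair uses fuse-programmed row/column redundancy inside the chip; the names and structure here are made up for illustration):

```python
# Illustrative only: a memory controller with spare cells, a simple
# march-style test, and a remap table that steers accesses away from
# cells that fail the test.

class RemappableMemory:
    def __init__(self, size, spares, faulty=()):
        self.cells = [0] * (size + spares)
        self.size = size
        self.next_spare = size            # first unused spare cell
        self.remap = {}                   # logical addr -> spare addr
        self._faulty = set(faulty)        # cells that are stuck at 0

    def _phys(self, addr):
        return self.remap.get(addr, addr)

    def write(self, addr, value):
        p = self._phys(addr)
        self.cells[p] = 0 if p in self._faulty else value

    def read(self, addr):
        return self.cells[self._phys(addr)]

    def test_and_remap(self):
        """March-style test: write patterns, read them back, and map
        any failing cell onto the next unused spare."""
        for addr in range(self.size):
            for pattern in (0xAA, 0x55):
                self.write(addr, pattern)
                if self.read(addr) != pattern:
                    self.remap[addr] = self.next_spare
                    self.next_spare += 1
                    break

mem = RemappableMemory(size=8, spares=2, faulty={3})
mem.write(3, 42)
print(mem.read(3))        # 0: cell 3 is stuck, the write was lost
mem.test_and_remap()
mem.write(3, 42)
print(mem.read(3))        # 42: address 3 now lives on a spare cell
```

The 0xAA/0x55 patterns are the usual alternating-bit pair, so every bit position gets exercised both ways.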
pretty expected for a memory cell to fail after some time
50 years is plenty of time for the first memory chip to fail. Most systems would face total failure from multiple defects in half that time, WITH physical maintenance.
Also remember it was built with tools from the 70s. Which is probably an advantage, given everything else is still going
Also remember it was built with tools from the 70s. Which is probably an advantage
Definitely an advantage. Even without planned obsolescence the olden electronics are pretty tolerant of any outside interference compared to the modern ones.
what we typically do is build in redundancy into the memory cells
Do you know how long that has been going on? Because Voyager is pretty old hardware.
Is there a Voyager 1, uh…emulator or something? Like something NASA would use to test the new programming on before hitting send?
Today you would have a physical duplicate of something in orbit to test code changes on before you push code to something in orbit.
Huh. If it survives a few more years, it’s a light-day away.
They have a spare Voyager on Earth for debugging. EDIT: or not.
Still faster than the average Windows update.
More stable, too.
Absolutely. The computers on Voyager hold the record for being the longest continuously running computer of all time.
Microsoft can’t even release a fix for Windows’ recovery partition being too small to stage updates. I had to do it myself, fucking amateurs.
Can’t or won’t? The same issue exists for both Windows 10 and 11, but they haven’t closed the ticket for Windows 11… Typical bullshit. It’s not exactly planned obsolescence, but when a bug like that comes up they’re just gonna grab the opportunity to go “sry impossible, plz buy new products”.
I didn’t know that. So the ticket is still open for 11 but there’s still no fix?
That is my understanding.
I can’t find the article that I read just yesterday, but this is somewhat the same story: https://www.theregister.com/2024/05/03/microsoft_windows_recovery_environment/
Not to mention what a bitch that partition is when you need to shrink or increase the size of your Windows partition. If you need to upgrade your storage, or resize the partition to make room for other operating systems, you have to follow like 20 steps of voodoo magic commands to do it.
The possibility of a catastrophic fuck up is way too high to put this on the average Windows user.
Whoa, learned that one at the weekend. Added a new NVMe drive, cloned the old drive. I wanted to expand my Linux partition, but it was at the start of the drive, so I shifted all the Windows stuff to the end and grew the Linux partition.
Thought I’d boot into Windows to make sure it was okay, just in case (even though I’ve apparently not booted it in 3 years). BSOD. 2-3 hrs later it was working again. I’m still not sure what fixed it, if I’m honest; I seemed to just rerun the same bootrec commands and repair startup multiple times, but it works now, so yay!
Hiren’s BootCD has a handy tool that can fix that BSOD. I’ve used it many times.
Jeez, I’ve just looked at the list of utilities, I’m not surprised, it’s got FireWire drivers for dos included. You’ve got to be pretty deep into the weeds at the point you need FireWire support in DOS from a recovery disk!
NASA should be in charge of Windows updates!
If they were it wouldn’t be Windows
Windows 13 update log:
Change kernel to Linux.
Build custom OS for astrophysics and space science applications.
happy rocket engineer noises
Now I’m curious. What would a NasaOS look like? Would it even be good for general use? Would they just focus on optimization? Could they finally beat Hannah Montana Linux, the superior OS?
I think it would have a real time kernel running parallel to a linux kernel.
Users could interact with the Linux kernel normally and schedule trusted real-time tasks on the other. Maybe there is reduced security for added performance on those cores. In general use it would be a normal stable system with the allure of a performance mode that will break your system if you are not careful.
Certainly better tested.
Well, they only had to test it for a single hardware deployment. Windows has to be tested for millions if not billions of deployments. Say what you want, but Microsoft testers are godlike.
Windows? Hardware testing? Testing in general? LMAOOO
Interviewer: Tell me an interesting debugging story
Interviewee: …
Heh. Years ago during an interview I was explaining how important it is to verify a system before putting it into orbit. If one found problems in orbit, you usually can’t fix it. My interviewer said, “Why not just send up the space shuttle to fix it?”
Well…
It’s hard to explain how significant the Voyager 1 probe is in terms of human history. Scientists knew as they were building it that they were making something that would have a significant impact on humanity. It’s the first man made object to leave the heliosphere and properly enter the interstellar medium, and this was always just a secondary goal of the probe. It was primarily intended to explore the gas giants, especially the Jovian lunar system. It did its job perfectly and gave us so many scientific discoveries just within our solar system.
And I think there’s something sobering about the image of it going on a long, endless road trip into the galactic ether with no destination. It’s a pretty amazing way to retire. The fact that even today we get scientific data from Voyager, that so far away we can still communicate with it and control it, is an unbelievable achievement of human ingenuity and scientific progress. If you’ve never seen the Pale Blue Dot image, you should see it. That linked picture is a revised version of the image made by NASA and released in 2020. It’s part of a group of the last pictures ever taken by Voyager 1, on February 14th, 1990: a picture of Earth from 6 billion kilometers away. It’s one of my favorite pictures, and it kinda blows my mind every time I see it.
The pale blue dot photo always makes me tear up. We’re so small and insignificant in such a grand universe and I’m crushed that I can’t explore it.
There will always be a “step further we’d love to see but won’t”. Let’s be glad we’re in that step which included this photo and the inherent magnificence in it.
It totally beats being one of the earlier humans who just wondered what the lights in the sky might be. Probably gods or something.
There will always be a “step further we’d love to see but won’t”
I dunno, it could be really bad out there. We like to have really romanticized versions of space exploration in our brains, like finding habitable planets and other intelligent life. But what if that other intelligent life is super far advanced, and also capitalist? And they figured out how to inject advertisements into brains. And they want to share their technology with us.
Let’s hope we figure something out before every other galaxy moves away from us faster than the speed of light.
Then we would want to see, what’s past that.
If they’d be super far advanced, they most likely won’t be capitalists 😁
I think the term “metal” is overused, but this is probably the most metal thing a programmer could possibly do besides join a metal band.
Or activate Skynet.
I was already impressed when they managed to diagnose a single bit flip a few years ago.
Keep in mind too these guys are writing and reading in like assembly or some precursor to it.
I can only imagine the number of checks and rechecks they probably go through before they press the “send” button. Especially now.
This is nothing like my loosey goosey programming where I just hit compile or download and just wait to see if my change works the way I expect…
They almost certainly have a hardware spare, or at the very least, an accurately simulated version of it, because again, this is 50-year-old hardware. So it’s pretty easy to just simulate it.
But yeah they are almost certainly pulling some really fucked QA on this shit.
deleted by creator
NASA has claimed to have never written a bug in a shipped piece of code, from what I can recall off the top of my head.
Rejected : please comment your changes
Man I can’t even get my stupid Azure deployment to work and that’s only in Germany.
As someone who recently switched from AWS to Azure I feel your pain.
Best part is when you finally have a working solution, Microsoft sends you an email that it’s being deprecated.
What’s wrong with AWS in your use case?
Oh I switched jobs, so not switch as in migrate.
The industry I work in now is very conservative, so Microsoft is a brand people know and “trust”. Amazon is scary and new.
Chances are that Microsoft won management over with discounts.
I won’t even upgrade the BIOS on my motherboard because I’m afraid of bricking it.
As a teenager I experienced a power outage while I was updating my bios.
Guess what happened?
I’m still bitter about it.
You can negate that risk by getting a UPS. You should get a UPS in any case imo since even a shitty one lets you at least save your work and shutdown properly if your electricity drops.
Oh yeah, I learned that lesson.
I got a big mean one these days.
I updated mine a couple of weeks ago. I was actually really anxious as it went through the process, but it worked fine, at first…
Then I found out Microsoft considered it a new computer and deactivated Windows. (And that’s when I found out they deleted upgrade licenses from Windows 7 & 8 back in September.) That’s Microsoft in a nutshell for ya.
Posting from Linux then?
Well, a “free” OS anyway.
Buy spare flash chip
Why do Tumblr users approach every topic like a manic street preacher?
There’s a significant overlap between theatre kids and Tumblr users.
That Venn diagram is maybe 3 degrees away from a circle.
A Venn diagram is not a pie chart, they’re all circles.
Like so much overlap of the two circles, it’s almost 1 circle.
Yeah, and we might use a ratio to describe that overlap, not degrees.
The area where it overlaps sometimes isn’t a circle.
Thank you, now I can’t stop hearing them in Alan Tudyk’s Clayface voice from the Harley Quinn series…
My understanding is that they sent V’Ger a command to do “something,” and then the gibberish it was sending changed, and that was the “here’s everything” signal.
And yeah, I’m calling it V’Ger from now on.
They specifically sent it a command to send a full memory dump after it went haywire. It wasn’t a fluke.
Sure - I didn’t know what “something” was. And what I’d read was that someone had to figure out that they were receiving a full memory dump, which suggested to me that they hadn’t specifically asked for that.
And yeah, I’m calling it V’Ger from now on
Have my upvote.
Why haven’t we been doing this already? I’m with you, let’s make this happen!
Only so long as we ensure we keep a stable humpback whale population. I don’t wanna be the guy that has to figure out how to make a temporal slingshot maneuver work.
This seems to be positive news on that front.
Are we really gonna have to have a time travel based Star Trek Movie for all the species out there to manage to get around to fixing climate change?
From what I read, there was damage to the memory in certain places, so they’ve had to move the code into spare places in memory.
It’s an astounding feat tbh.
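As a toy illustration of what “moving the code into spare places in memory” involves (my own simplification, not NASA’s actual procedure): you can’t just copy the bytes, because any absolute jump targets pointing into the old region have to be patched to the new addresses.

```python
# Illustrative only: relocate a block of "code" past a bad memory
# region and fix up absolute jumps. Instructions are modeled as
# (opcode, operand) pairs; a jump's operand is an absolute address.

def relocate(memory, code_start, code_len, spare_start, jump_opcode=0x20):
    """Copy the code block to the spare region, retargeting any jump
    whose destination lies inside the moved block. Returns the new
    entry point."""
    delta = spare_start - code_start
    for i in range(code_len):
        op, arg = memory[code_start + i]
        if op == jump_opcode and code_start <= arg < code_start + code_len:
            arg += delta            # jump into the block: shift target
        memory[spare_start + i] = (op, arg)
    return spare_start

# A 3-instruction loop at addresses 10..12, looping back to 10
memory = {10: (0x01, 5), 11: (0x02, 7), 12: (0x20, 10)}
entry = relocate(memory, code_start=10, code_len=3, spare_start=100)
print(entry, memory[102])   # 100 (0x20, 100): the loop now jumps to 100
```

On Voyager the spare space was reportedly scattered across several regions rather than one contiguous block, which makes the bookkeeping far hairier than this sketch.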
One specific chip had damaged memory
The vaginer
When I heard what they did, I was blown away. A 50-year-old computer (that was probably designed a decade before launching), and the geniuses that built it put in the facility to completely reprogram it from a light-day away.
Great documentary on the Voyager team: It’s Quieter in the Twilight.
I prefer the sequel Star Trek: the motion picture.
V’ger 2: 2patch2furious
Bu… but there… there already was a Star Trek Voyager?