I think we need to mandate intentionally slower, sandboxed, resource-constrained development environments so developers can't treat resources as "free", writing wasteful algorithms that expand to fill the volume of the container (RAM, CPU, IOPS, storage capacity, network bandwidth and latency) like an ideal gas. Lazy coding and excessive abstractions on top of VMs on top of more abstractions, all the way down, leads to shit.
This comment makes it feel a lot safer, when you think about it.
"Web browsers are historically known for crashing, but that's partly because they have to handle every page on the whole Internet. A static system with the same browser running a single website, heavily tested, may be reliable enough for our needs."
When you've also built the metal that you're running that React on, it's a lot warmer and cozier than having to trust the whole fat Windows 11 codebase on Artemis...
This talk about off-the-shelf hardware in space makes me wonder, given the clear line of sight, if it would be possible to detect their Wi-Fi access points' beacons from Earth. I'm not a "radio guy" and don't know if this would be impossible, simply on the basis of physics, due to the presumably low radiated power from the APs and the limitations of the size of typical antennas on the ground. (Obviously it's possible with the right equipment. We can communicate with the Voyager probes, but that's not with a "can-tenna" and an off-the-shelf Wi-Fi card...)
Edit: Anybody know how difficult it would be to keep an antenna pointed at them? I have no intuition for how fast their transit would be. I assume, since an orbit is around 90 minutes, pretty damned fast.
Edit 2: Some search-engining and back-of-the-envelope not-very-good-at-trig math says the longest possible transit would be about 5 minutes, moving through about 40 degrees of arc per minute. I'm probably completely talking out my ass, though.
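A rough sanity check of those numbers, sketched in Python. This assumes a ~400 km circular orbit (ISS-like), an observer directly under the ground track, flat terrain, and no atmospheric refraction; the altitude and simplifications are my assumptions, not from the comment. It suggests the maximum horizon-to-horizon pass is closer to 10 minutes, with peak apparent motion around 1.1 deg/s (~66 deg/min) at zenith, so the 40 deg/min figure is at least the right order of magnitude:

```python
import math

MU = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0    # mean Earth radius, m
ALT = 400_000.0          # assumed orbital altitude, m

r = R_EARTH + ALT
period_s = 2 * math.pi * math.sqrt(r**3 / MU)   # orbital period (~92 min)
v = math.sqrt(MU / r)                           # orbital speed, m/s

# Satellite is above the horizon while within this central angle of the
# observer (spherical Earth, zero-elevation horizon).
half_angle = math.acos(R_EARTH / r)
pass_s = (2 * half_angle / (2 * math.pi)) * period_s   # max overhead pass

# Fastest apparent motion is at zenith: angular rate ~ speed / altitude.
zenith_rate_deg_s = math.degrees(v / ALT)

print(f"orbital period: {period_s / 60:.1f} min")
print(f"max horizon-to-horizon pass: {pass_s / 60:.1f} min")
print(f"peak angular rate: {zenith_rate_deg_s:.2f} deg/s "
      f"({zenith_rate_deg_s * 60:.0f} deg/min)")
```

In practice passes are shorter, since most are not directly overhead and anything near the horizon is unusable.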
It feels like it would be doable to keep a directional antenna trained on a target moving at that speed.
Probably not possible. Their Wi-Fi access point is inside the capsule, and the capsule is made from metal, which is probably shielding the signal somewhat. Maybe even quite a lot if it's intended to provide some radiation shielding. Also, it's low power, since it only needs to work inside the capsule; at the given distances, signal attenuation will make it almost impossible to pick up anything.
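The attenuation point can be put in numbers with a free-space path-loss sketch. Everything here is assumed for illustration: a 100 mW (20 dBm) AP on Wi-Fi channel 6, a 400 km slant range, an isotropic ground antenna, and no hull attenuation at all:

```python
import math

C = 299_792_458.0        # speed of light, m/s
freq_hz = 2.437e9        # assumed: Wi-Fi channel 6
dist_m = 400_000.0       # assumed: ~400 km slant range

# Free-space path loss: FSPL = (4 * pi * d * f / c)^2, in dB
fspl_db = 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

tx_dbm = 20.0                        # assumed AP EIRP: 100 mW
rx_dbm = tx_dbm - fspl_db            # power at an isotropic ground antenna

# Thermal noise floor for a 20 MHz Wi-Fi channel: -174 dBm/Hz + 10*log10(BW)
noise_dbm = -174 + 10 * math.log10(20e6)

print(f"path loss: {fspl_db:.0f} dB")
print(f"received: {rx_dbm:.0f} dBm vs noise floor {noise_dbm:.0f} dBm")
```

That works out to roughly 152 dB of path loss and a received level around -132 dBm against a ~-101 dBm noise floor, i.e. some 30 dB below the noise floor before the hull is even considered, which is why you'd need a large dish and narrowband processing rather than a can-tenna.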
Oh yeah, I remember how some computer pulled a Windows update over a satellite connection during a research flight (aircraft). That was super expensive, wow. Now Microsoft servers are blocked at the outgoing gateway, since you couldn't reliably stop it on the computer itself and new teams with new computers keep coming in.
I'm not letting Microsoft off the hook here, but if you have an expensive metered connection and you're trusting clients (especially a modern personal computer of any operating system type) to play nicely with bandwidth, that's 100% on you.
That's a really sorry state of things, then. There's zero trust in software now, in the literal sense. How did we end up in a world where you can't trust a client to enforce its own documented behavior? How did it become the user's fault for not using OS- and hardware-level measures, rather than the software vendor's fault, when the "Automatic updates" toggle is a no-op?
Software design is not really my wheelhouse so I can't comment meaningfully on that, but on the networking side I can very confidently say it was a poor architecture. You simply cannot assume that all of your clients are going to be both 1) non-malicious and 2) work exactly as you think they will.
Link saturation would be one of the first things that would come to mind in this situation, and at these speeds QoS would be trivial even for cheap consumer hardware.
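The QoS mechanism behind that claim usually boils down to a token bucket. This is an illustrative Python sketch of the idea (the class and parameter names are invented, not any router's API); real gear does the same thing per-queue in the forwarding path:

```python
import time

class TokenBucket:
    """Rate limiter: tokens refill at a fixed rate up to a burst capacity."""

    def __init__(self, rate_bps: float, burst_bytes: float):
        self.rate = rate_bps / 8          # refill rate in bytes/sec
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def allow(self, packet_bytes: int) -> bool:
        # Refill tokens for the time elapsed since the last packet.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False                       # drop (or queue) the packet

# e.g. cap bulk/background traffic at 128 kbit/s with an 8 KB burst allowance
bulk = TokenBucket(rate_bps=128_000, burst_bytes=8_192)
```

On a metered satellite link you'd put something like this (plus a priority queue for interactive traffic) at the choke point, rather than trusting endpoints to self-limit.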
Well, on the software design side, there's plenty of scenarios where undocumented behavior crops up on unexpected network interruption. In the example above, Windows can even pre-download updates on metered connections during one time period, then install those updates during another. The customers really can't take the blame for that, IMO.
I think software quality across society has rapidly deteriorated, mostly because of the devaluing of software design. No one expects quality from software, everyone "understands there are bugs", and some like to take advantage of that. And so the Overton window gets pushed toward "broken forever, good luck holding the bag if you use it" rather than the more realistic "occasionally needs to restart IFF you hit an issue, and it takes under 10 seconds with minimal data loss".
MBAs and consultants hijacked the industry, along with an influx of people who consider leetcode alone sufficient for hiring. The past 10 years have seen a major injection of these people into big tech. The resulting mess is predictable, and it'll get worse, which is why we need to break up these companies and let better, more efficient companies take their place rather than letting them subsidize their failures with their monopolies.
> How did we end up in a world where you can't trust a client to enforce its own documented behavior?
My guess: a combo of economic incentives and weak legal protections.
I realize that answer applies to so many issues as to be almost not worth saying, but I think it's still true here.
In an environment where bandwidth utilization costs money I think it's a good belt-and-suspenders approach, regardless of the expected behavior of the clients, to enforce policy at the choke point between expensive and not-expensive.
(I think more networks should be built with default deny egress policies, personally. It would make data exfiltration more difficult, would make ML algorithms monitoring traffic flows have less "noise" to look thru, and would likely encourage some efficiency on the part of dependencies.)
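A default-deny egress policy can be thought of as allowlist data plus a lookup: anything not explicitly listed is blocked. A minimal sketch of the idea in Python; the hostnames and ports are invented examples, not any real network's policy:

```python
from fnmatch import fnmatch

# Allowlist of (hostname pattern, port) pairs; everything else is denied.
EGRESS_ALLOW = [
    ("mail.example.internal", 993),    # IMAP to the one sanctioned server
    ("*.ntp.example.internal", 123),   # time sync
]

def egress_allowed(host: str, port: int) -> bool:
    """Default deny: permit only destinations matching the allowlist."""
    return any(fnmatch(host, pattern) and port == allowed_port
               for pattern, allowed_port in EGRESS_ALLOW)

assert egress_allowed("mail.example.internal", 993)
assert not egress_allowed("windowsupdate.example.com", 443)  # denied by default
```

Real deployments express the same structure as firewall rules (e.g. a final drop rule after explicit accepts), but the shape of the policy is identical.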
The astronaut's quote needs to be a billboard ad: "I also see I have 2 instances of Outlook, and neither of those are working".
One is hacked by a Russian hacker group based in St. Petersburg, the other is hacked by a Chinese hacker group, and the third instance was actually BackOrifice but it couldn't get enough resources to run because of the other two.
Houston, we have a problem!
Why on God's green earth is Windows running on the Artemis spaceship?
What should they use for email?
Literally anything else. In 1992 we did email on the command line with green-screen terminals.
Pine
I am still using mutt.
Alpine & Linux?
Are you running that on a flight system or on an additional computer where it would be fine to run Windows?
Why do they “need” email access in the first place?! What the actual fuck.
Why the poverty mindset? If we're gonna joyride around the Moon, we should at least do it in style.
Bashing MS products and ReactJS (apparently used by SpaceX UIs) is a common pastime here, and I'm guilty of it myself.
But here we're talking about actual space rockets flying to space with humans in them.
My expectation would be that something like https://tigerstyle.dev/ would be followed, or the NASA coding rules linked from there: https://spinroot.com/gerard/pdf/P10.pdf
At a previous job I was a developer on a medical instrument that used Windows to run the UI.
Before everyone gets all up in arms about it, Windows/Linux UI & database with external microcontrollers handling real-time control is a very common architectural choice for medical and industrial equipment. To the point where many Systems-on-Module (SoMs) come with a Linux-capable ARM processor and a separate, smaller processor for real-time, linked via shared memory.
Anyway, a customer called to report a weird bug that we couldn't resolve. After remoting into the instrument, we discovered that one of the lab technicians had attempted to install Excel on it. At some point the install must have failed, but it left a .dll behind that was causing a conflict with something in our code and keeping the UI from starting properly.
No, we did not learn anything from this incident...
That wasn't a Therac-25 by any chance?
Sorry. Couple decades too late.
Isn't this what Embedded Windows was always for, like for use in medical equipment, ATMs, POS, PLC, oscilloscopes, etc? Basically stuff that's supposed to be fire-and-forget, run 24/7 and that the user shouldn't be able to tinker with.
And also what group policies were for, that can disable the user from installing any software?
Am I wrong to assume that the fuckup was on your end, for using the wrong tool and not configuring it properly?
> Am I wrong to assume that the fuckup was on your end, for using the wrong tool and not configuring it properly?
Not at all. I agree that it should have been locked down and only privileged accounts should be allowed software update. But the system auto-booted into an Administrator account so it really wasn't a surprise that eventually someone would do something stupid.
I will say that this was for Windows NT retail, not Windows NT Embedded. At that point, getting an NT Embedded license practically required sacrificing your firstborn child. It was only when Microsoft got to Win XP Embedded that the license didn't look like it was written by a team of lawyers who already knew that they were perpetually in Hell.
Memories now of what we were given at the hospital long ago: our obstetrics ward was using Philips OBTraceVue software. The original FDA-approved system required Philips to package the OS and hardware all together, so we were given a bunch of generic Compaq desktops to run their fetal heartrate monitoring on.
The biggest complaint was "we want to run our EHR software on it!", but because of the FDA requirements, we weren't allowed to install anything on the box. Yet somehow providing AV could be OK in some cases, and in other cases installing Citrix? And then we'd find out someone had managed to install the EHR client onto it anyway, and it became a big mess: Philips had to send out a tech of their own to reimage a PC we couldn't "legally" service ourselves.
It was a big messy pain for a while back in the day. I was happy when we finally got to upgrade to the newer IntelliSpace software on our own PCs in the ward. (Also got to meet a support engineer who came out rocking an Agilent badge, which was a neat bit of history in its own right...)
> somehow providing AV could be OK in some cases, and in other cases installing Citrix?
The only way this could possibly have passed FDA scrutiny would be if the original manufacturer had validated this particular system configuration and approved it.
There's probably tons of stuff like this going on all over the place, but it manages to stay under the radar, so no one notices it. But with the FDA's increased scrutiny on cybersecurity, it will eventually disappear.
Back in the early aughts I remember seeing an ATM in Rome that had evidently crashed and was sitting at a DOS prompt. I was much younger then, but I remember thinking it wasn't terribly surprising, though it was also a bit of a Wizard of Oz moment.
This is a crew laptop, not a mission-critical computer at all.
Everyone likes to point and laugh, sure, I'm getting a chuckle as well.
However, on a more practical level, what are the other options? Outlook, the desktop application, works really well with local copies, is pretty low-bandwidth, and is very familiar to end users.
IMAP with Thunderbird is probably the only other option that would satisfy the requirements.
EDIT: Yes, they need email in space. It's an easy way to send documents back and forth.
> Yes, they need email in space. It's an easy way to send documents back and forth.
To me that's probably much more interesting. We assume they have all this fancy NASA tech, probably some special communication protocols, but nope, email is fine. Still not sure why they'd use Outlook, but I guess it's easier than retraining astronauts on Alpine or Mutt.
How long did the US military rely on mIRC... decades, maybe they still do?
I'd ask the opposite question. Why would they not use Outlook and instead use something like Alpine or Mutt? This is 2026, you know.
US Military still uses IRC/mIRC for similar reasons. Easy to self host and it's low bandwidth.
Yeah, the only other option I’d consider for this would be Apple Mail on an iPad for the same reason that it works well offline or with low bandwidth networks. There’s a QA issue here but the logic is quite reasonable.
I don't know why people are surprised by this. Using suitable off-the-shelf solutions for non-mission-critical purposes seems like a very reasonable thing to do.
I'm recalling this from my memory of "The Space Above Us" podcast: There were various bespoke teleprinters sent up on early shuttle flights that had exciting failure modes (if I remember correctly one of them started smoking) and in at least a couple of cases they had to stow the new hardware and pull out the old backup hardware because the new stuff didn't work.
I quit Outlook and went to Thunderbird when I upgraded my CPU and Microsoft told me I had to re-purchase Outlook when I had paid for a "lifetime license". That was the last straw for me. I installed Linux and Thunderbird and have not looked back at Windows.
We migrated earlier this year and had a similar problem. Outlook (classic) works differently than the OWA version. They keep the classic version so people don't spontaneously throw a chair out a window. It's being phased out slowly.
I'm betting in 15 years, people will still be using Outlook (classic). This is the culture.
It depends on how badly Microsoft continues to fuck up Outlook (classic).
I don't use Outlook for my personal email, but I've used it in various corporate engagements and not been wholly dissatisfied. Newer versions are slower, more bloated, and unstable (though add-ins-- especially the Teams add-in-- contribute to that).
The most egregious regression, for me, has been the "Advanced Find" functionality (which was wonderful in the 97 thru 2010 versions) being changed-out for the god-awful search box within the Outlook window.
We could have said that about Publisher a few years back. Its death knell has sounded, and Microsoft isn't even offering a way for people to properly view or print their Publisher files, let alone edit them.
The culture is correct, the new version of Outlook is hot garbage
Which "new" Outlook? I think there's like 3 versions of Outlook currently on the market. The Classic Win32 one they want you to stop using, the new Lite variant bundled for free with Windows 11, and the new Full Spec one that comes with Office 365, both of which are built on web technologies IIRC.
It's the fucking federal government's policy. Basically it amounts to "pay Microsoft as a form of corporate welfare", "permit but not really allow Linux", and "this is how it has always been done".
And also because apparently "nobody has ever been fired for choosing Microsoft", which is something that should start happening more often if you ask me
As long as Linux distros have such shit accessibility stories, macOS and Windows being available should be a requirement for all government systems.
They also keep their own inboxes; emails downloaded to or sent from the old version are not visible on the new version.
The poor technicians having to RDP with (what I imagine must be) horrible latency. Although it still might be better than some corporate environments lol
At the time they were ~57,000km out and I calculated it was at least 380ms RTT to the ground receiver, so bad but not unusable.
At its current distance, best case RTT would be about 420ms
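Those figures check out against pure light-time (ignoring ground-segment and processing delays). A quick sketch; note the ~63,000 km distance for the 420 ms case is inferred here, not stated in the comment:

```python
C_KM_S = 299_792.458   # speed of light, km/s

def rtt_ms(distance_km: float) -> float:
    """Round-trip light time in milliseconds, propagation delay only."""
    return 2 * distance_km / C_KM_S * 1000

print(f"{rtt_ms(57_000):.0f} ms")   # ~380 ms at 57,000 km
print(f"{rtt_ms(63_000):.0f} ms")   # ~420 ms at 63,000 km (inferred distance)
```

For comparison, that's in the same range as a single geostationary satellite hop (~36,000 km up, ~72,000 km round trip per direction), which RDP users on VSAT links already tolerate.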
That wouldn't be terrible to use. I feel like I've done worse supporting in-cab computers on fleet vehicles across 3G cellular.
Keyboard shortcuts and "caching" the state of the remote client in your mind are the keys to doing that work.
computer virus
noun
A program which can covertly transmit itself between computers via networks (especially the Internet) or removable storage such as CDs, USB drives, floppy disks, etc., often causing damage to systems and data.
A software program capable of reproducing itself and usually capable of causing great harm to files or other programs on the same computer.
Maybe for emails and calendars, wouldn't want them to arrive and miss the appointment.
Coordinating time-zone issues between remote meeting participants on a single celestial body is complex enough. >smile<
I don't understand the title.
It doesn't seem like they are trying to figure out why two copies of outlook are installed, they're trying to figure out why neither is giving them access to their email.
People opening the "wrong" Outlook has been the norm for the last couple of years. Between "Outlook (classic)", "new" Outlook (rolled out with Office 365 clients), and "Outlook" (rolled out with Windows 11) it's been a shit show for a while now.
Is this actually true? What's next, a BSOD? I would never in my life have bet that Microsoft software would be shipped on a spacecraft carrying human beings. Unbelievable.
I didn't expect that they'd be running Windows up there. Shouldn't it be something specialized and curated... something else?
I want to say something like "oh well, this is certainly a non-critical piece of software". Hopefully it's the convenient dashboard and there are other, more hardened consoles for fallback or something.
But in all seriousness, and without glibness or sarcasm: I cannot comprehend how there is any "unexpected" software running on that spacecraft, regardless of operating system.
EDIT*** For those who like me only watched the video and didn't read the thread: This is on a laptop that is non-critical, it is not a part of the spacecraft. Whew. Now I'm sad that one of the Linux distros didn't try to pitch themselves to the astronauts for a sponsorship... Would have been especially on brand for Pop_OS.
They have a diverse range of devices, including iPhones
It'd be fun to be a telco tech and realize that you've got an attempted connection from an iPhone 50 miles up and receding fast.
And connected to the network!
From the comments:
Andy Meyers (@andymeyers10.bsky.social): I said "launch window", not "Launch Windows"!
Copilot (which one?!?) says "I'm sorry Dave, I cannot allow you to do that"
Why in the name of all that's holy would you use a Microsoft product on a mission like this? Just about the only thing you can trust about MS is that their software is buggy.
Because they have the power to insert themselves in places like these. It's a bigger problem. There are places in which companies with Microslop's level of quality have no business to be, but they're already there.
The USS Yorktown, the Aegis missile cruiser, comes to mind for some reason.
I think it was the same ship which shot down a passenger airliner.
I believe that the use of Windows NT for Aegis control was fleet-wide, so that problem wasn't unique to the Yorktown. That just happened to be where it was discovered.
“Guns don’t kill people, Windows does”?
Is it just me who finds it terrifying that there are any Windows bits on a spaceship?
Clippy: “Hi, it looks like you’re trying to go to the Moon”
Ugh. Actually...
> The thing about Space is that it's just so huge. Unbelievably so. And the real challenge? You have to make all your delta-V for orbital speed by pushing gas very fast. In one go.
>"Is it just me who finds it terrifying that there are any Windows bits on a spaceship?"
SpaceX Crew Dragon console interfaces are entirely React apps
Everything is terrifying in computing these days, and bringing it to space rockets makes it even more terrifying.
I think we need to mandate intentionally slower, sandboxed, resource-constrained development environments/containers, so developers can't abuse resources as if they were "free", letting wasteful and improper algorithms expand to fill the volume of the container (RAM, CPU, IOPS, storage capacity, network bandwidth and latency) like an ideal gas. Lazy coding and excessive abstractions on top of VMs on top of more abstractions, all the way down, leads to shit.
https://news.ycombinator.com/item?id=41655299
This comment makes it feel a lot safer, when you think about it.
"Web browsers are historically known for crashing, but that's partly because they have to handle every page on the whole Internet. A static system with the same browser running a single website, heavily tested, may be reliable enough for our needs."
When you've also built the metal that you're running that React on, it's a lot warmer and cozier than having to trust the whole fat Windows 11 codebase on Artemis...
Blame the gaggle of idiots from that slop thread the other day.
Now that the clowns are running the circus, I suspect digital goods will begin to rapidly decay soon.
There was a literal meme in Space Force about this. Have we learnt nothing?
Microslop will now troll people outside of the Earth, a great achievement for them.
So does this mean they now also have... 2 Copilots... ? Terrible joke.
Did they consider scrapping the humans, and just installing co-pilot? heh .. heh.. /s
We can't even leave the planet without MS enshittifying our equipment. God, I really want out of this timeline
Just imagine the aliens capture a probe and try to use Windows. What will they think of us?!
Don't worry, it will stop them at installation and demand internet access and creation of a Microsoft cloud account.
First they laugh.
Then they wanna just cry when it brings down their whole starfleet with a virus that they have no immunity to ;)
I'm so tired.
Clippy "It looks like you're trying to go to the Moon. Want any help with that?"