Only after 20 years. Light will take 10 years to make it from Earth to the mirror, and another 10 to travel back.
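Spelling out the arithmetic, with $d = 10$ light-years each way:

$$t = \frac{2d}{c} = \frac{2 \times 10\ \text{ly}}{c} = 20\ \text{years}$$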
Well, first of all China does make lithography equipment (for instance, Shanghai Micro Electronics Equipment, who are currently at 28 nm). There are a couple of others iirc, and they typically got started by licensing lithography technology from Japanese companies and then building on it.
The issue is mostly one of economics – fabs want higher-resolution lithography as soon as possible, and they only buy it once, which means the first company to develop a new litho technology takes the lion’s share of the revenue. If you’re second to market, or more than half a dozen nodes behind like SMEE is, there isn’t much demand left: fabs are already full of litho machines bought back when that node was new.
The issue with a new company making leading edge nodes is the incredible R&D and development cost involved. Nikon, Canon, and ASML shared the market when they all started developing EUV tech, and it took ASML 15+ years to develop it! Canon and Nikon teamed up, spent tens of billions of dollars on R&D, and dropped out once they realized they couldn’t beat ASML to market because there wouldn’t be enough market left for them to make their money back.
If you want to learn more about the history of the semiconductor industry, I recommend the Asianometry YouTube channel!


Knowing how to write a good study is a matter of experience more than intelligence.


Yup, and the price of the Xbox Ally is ridiculous, as expected!


Openness is great, but there’s no financial reason to make specialized hardware to operate an open platform.
Historically, consoles have been sold near cost, and profits have been made on game sales after the fact. If you can just buy your games from Steam on console, the price of the console will go up. At some point, it no longer makes sense to buy the specialized hardware.
But we’ll get to see how that goes! It’s looking more and more like the next Xbox is going to run Windows.
If you are truly starting from scratch, shooting for Raspberry Pi performance isn’t starting small; that’s a huge goal. It’s a complex chip built on a fairly modern process node (28 nm for the 4B) using the second-best-established architecture.
A more reasonable goal to shoot for would be an 8086-like chip, then perhaps something like a K6-II or an early Pentium, slowly working your way up from there.
There are a couple of further questions to answer here. First, when you say “using only tech that is in the open, nothing proprietary,” how strictly do you mean that? Historically, what Chinese foundries have done is buy a fab line far enough from the leading edge not to be questioned, then use that as a starting point for working toward smaller nodes. If that’s allowed, it would be fairly trivial; 40 nm doesn’t perform that badly.
If you want the equivalent of “open-source” fab equipment, as far as I know that has never existed. In better news, if you go back to DUV/immersion lithography, it wasn’t just ASML making the machines – Nikon and Canon were still in the game, so power was less centralized.
Second, what is the actual goal? If it’s just compute, no big deal. As long as you can write a C compiler for your architecture (or use RISC-V, as other folks have mentioned), getting the Linux kernel running shouldn’t be too hard. However, you’re going to have to deal with manually modifying the firmware of any peripherals you want to run – PCIe devices, USB, I2C, etc. I’m not a firmware engineer, so I have no idea how hard that would be, but it’s one of the things that’s been holding back Linux on Arm over the years.
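For a sense of what the very first milestone looks like: long before any kernel port, you just want your C toolchain to produce code that runs on bare metal. Here’s a minimal sketch in C – the UART address is an assumption on my part (0x10000000 happens to be where QEMU’s riscv `virt` machine puts its 16550, so you can try the idea there):

```c
/* Hypothetical first bring-up program for a from-scratch CPU port:
 * print over a memory-mapped UART with no OS and no libc. */
#include <stdint.h>

/* Transmit holding register of a 16550-style UART. The address is an
 * assumption -- it matches QEMU's riscv "virt" machine. */
#define UART_THR ((volatile uint8_t *)0x10000000)

static void putstr(const char *s)
{
    while (*s)
        *UART_THR = (uint8_t)*s++; /* ignores FIFO status; fine for a demo */
}

void _start(void)
{
    putstr("hello from a bare-metal C toolchain\r\n");
    for (;;)
        ; /* nowhere to return to without an OS */
}
```

In practice you’d also need a few lines of startup assembly to set up a stack pointer, plus a linker script matching your memory map – the point is just that a working C compiler and one memory-mapped device get you to “hello.”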
All in all, depending on how strict you want to be, it could be anywhere between slightly difficult and effectively impossible.


Given the state of that trailer, it’s a shame it’s anywhere. We hate to see an IP being milked so hard.


Yes, I’m also surprised it’s so low, if only because during sales you can get like 3-5 older indie games for $15. Those games are often shorter and have more controlled scope as well, meaning more folks would actually have time to play them.
On the other hand, it means folks are only buying games they’ll actually play, which is good.


Okular is the way to go for anything that’s typed; it has a lot more capabilities than Evince. For handwriting, I’ve used Inkscape and LibreOffice Draw. They’re roughly similar in capabilities.
I had no idea! It seems like it was a really unruly project to manage, but it’s a shame to lose the centralization of having one app that can configure anything. I don’t see any problem in having package management split off into Myrlyn, but it sounds like Cockpit is much more limited in scope, which is a shame, since handling the edge cases gracefully was what made YaST so useful.
Here’s a source for others who didn’t realize.
Agreed. Fortunately, I don’t see anything about that being planned, they are just separating system installation from system management. I’m fine with that, as long as the new installer keeps the good control over partition management.
My reading is that the installer is no longer based on YaST, not that YaST has been retired overall.


I mean, isn’t this just Xen revisited? I don’t understand why this is necessary.
Should be in testing within a day or two, might take a week or more to make it to stable.
edit: this is wrong (sorry!), see replies


Some motherboards expose wake-on-LAN as an explicit BIOS option. If it’s not in the BIOS it’s going to be a bit harder, but the recommended software option (the Arch Linux forum link) looks interesting.
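For context on what the software side is actually doing: “waking” a machine is just broadcasting a magic packet – 6 bytes of 0xFF followed by the target’s MAC address repeated 16 times – onto the LAN over UDP. A minimal sketch in C (the MAC is a placeholder; error handling omitted):

```c
/* Minimal Wake-on-LAN magic packet sender (Linux/POSIX).
 * Packet format: 6 bytes of 0xFF, then the target MAC repeated
 * 16 times, broadcast over UDP (port 9 by convention). */
#include <arpa/inet.h>
#include <netinet/in.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

int main(void)
{
    const unsigned char mac[6] = {0xAA, 0xBB, 0xCC, 0xDD, 0xEE, 0xFF}; /* placeholder */
    unsigned char pkt[102];

    memset(pkt, 0xFF, 6);              /* sync stream */
    for (int i = 0; i < 16; i++)       /* MAC repeated 16 times */
        memcpy(pkt + 6 + i * 6, mac, 6);

    int s = socket(AF_INET, SOCK_DGRAM, 0);
    int on = 1;
    setsockopt(s, SOL_SOCKET, SO_BROADCAST, &on, sizeof on);

    struct sockaddr_in dst = {0};
    dst.sin_family = AF_INET;
    dst.sin_port = htons(9);           /* "discard" port, conventional for WoL */
    dst.sin_addr.s_addr = htonl(INADDR_BROADCAST);

    sendto(s, pkt, sizeof pkt, 0, (struct sockaddr *)&dst, sizeof dst);
    close(s);
    return 0;
}
```

The target’s NIC still has to be armed to listen for these packets, which is what the BIOS toggle (or `ethtool -s <iface> wol g` on Linux) controls.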
Yup, he was eating sodium bromide instead of sodium chloride. Any significant amount of bromide is not good for ya.
Depends on how much power is being transmitted to each base station, but it would have to be a colossal satellite to reach “we’re all going to die” territory.
I pointed that out mostly as a limitation on how much power could be transmitted to each base station.
Microwave scattering is an absolute nightmare over that kind of distance. Even on the ground, microwaves are only practical to move a few meters at a time, inside a waveguide.
If it’s transmitting to a base station, we can assume it’s in geosynchronous orbit, or about 22,000 miles from the surface. With a fairly large dish on the satellite, you could probably keep the beam fairly tight until it hit the atmosphere, but that last ~100 miles of air would scatter it like no tomorrow. Clouds and humidity are also a huge problem – water is an exceptionally good absorber in most of the MW band.
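To put rough numbers on “fairly tight” – the 5.8 GHz link ($\lambda \approx 5.2$ cm) and the kilometer-scale transmit dish below are my assumptions, not reported figures – the diffraction-limited half-angle of an aperture of diameter $D$ is about $1.22\,\lambda/D$:

$$\theta \approx 1.22\,\frac{\lambda}{D} = 1.22 \times \frac{0.052\ \text{m}}{1000\ \text{m}} \approx 6.3 \times 10^{-5}\ \text{rad}$$

$$r_{\text{spot}} \approx R\,\theta \approx (3.6 \times 10^{7}\ \text{m})(6.3 \times 10^{-5}) \approx 2.3\ \text{km}$$

So even a kilometer-scale dish paints a spot kilometers wide on the ground, and the spot diameter doubles every time you halve the dish – part of why the hardware at both ends has to be colossal before scattering losses even enter the picture.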
I saw numbers reported for the transmission efficiency somewhere (will update this if I can find it again), and they were sub-30%. The other 70% is either boiling clouds on its way down, or missing the receiver on the ground and gently cooking the surrounding area.
Oops, that’s right!