

I remember the days when bugs in x86 CPUs were almost unheard of. The Pentium FDIV bug and the F00F bug were considered these unicorn things.
Distributed computing would eliminate the water usage, since the heat output wouldn’t be so highly concentrated, but it would probably somewhat increase power consumption.
In an ideal world I think data center waste heat would be captured for use in a district thermal grid / seasonal thermal energy store like the one in Vantaa.
Of course that isn’t to say that we shouldn’t be thinking about whether we’re using software efficiently and for good reasons. Plenty of computations that take place in datacenters serve to make a company money but don’t actually make anyone’s lives better.
Goddamn
I previously kinda liked Herzog (he was amusing to listen to if nothing else), not anymore.
How do you boil 11,000 rats alive and then go on and make 10 minute long thinkpieces about the profound sadness of the death of a single penguin that leaves its flock? What a fucking masturbatory asshat.
Phones can figure out their location using gyroscopes and accelerometers
This is plainly false.
The error stack-up from the imprecision of a phone’s MEMS sensors would make positioning basically impossible after a couple of dozen feet, let alone after hours of walking around.
There are experimental inertial navigation systems that can do what you describe, but they use ultra-sensitive magnetometers to detect tiny changes in the behavior of laser-suspended, ultra-cold gas clouds only a few hundred atoms in size. That is not inside your phone.
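To put some rough numbers on the error stack-up: position comes from double-integrating acceleration, so even a constant uncorrected sensor bias grows into position error quadratically with time. The bias value below is my own illustrative assumption, not a measured spec for any particular phone.

```python
# Back-of-the-envelope dead-reckoning error from a constant
# accelerometer bias. Double integration gives:
#   position_error = 0.5 * bias * t**2
# 0.05 m/s^2 is an assumed (optimistic) residual bias for a
# consumer MEMS accelerometer after calibration.

bias = 0.05  # m/s^2, illustrative assumption

for t in (10, 60, 600):  # seconds
    err = 0.5 * bias * t**2
    print(f"after {t:4d} s: ~{err:,.1f} m of position error")
```

Even with this generous bias you're tens of meters off after a single minute, which is why phone-only inertial positioning falls apart so fast.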
What is the “everything” that Rust is being used in? From what I’ve heard it’s being used in the same places you’d use C or C++, not in any other niches.
I’m sure it makes the bean counters happier to have another asset valued at X amount, but in practice the software will just be locked in some vault where it won’t do anyone any good.
It’s an instance where the number on the screen doesn’t actually correspond to any useful economic activity.
For a while now I’ve been thinking about how nice it would be to have something like a modern version of the Poqet PC.
The Poqet PC had a much nicer keyboard than the laptop in the article, and between the simplicity of its software and a very aggressive power management strategy (it actually paused the CPU between keystrokes) it could last for weeks to months on two AA batteries.
Imagine a modern device with the same design sensibilities. Instead of an LCD screen you could use e-ink. For both power efficiency, and because the e-ink wouldn’t be well suited to full motion video, the user interface could be text/keyboard based (though you could still have it display static images). Instead of the 8088 CPU you could use something like an ARM Cortex M0+, which would give you roughly the same amount of power as a 486 for less than 1/100th the wattage of the 8088. Instead of the AAs you could use sodium ion or lithium titanate cells for their wide temperature range and high cycle life (and although these chemistries have a lower energy density than lithium ion, they’d probably still give you more capacity than the AAs, especially if you used prismatic cells). With such a minuscule power consumption you could keep a device like that charged with a solar panel built into the case.
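As a sanity check on the solar-charging claim, here's some back-of-the-envelope arithmetic. Every number here is my own assumption (panel size, efficiency, irradiance, MCU draw), not a spec from any real product.

```python
# Rough feasibility check for keeping a low-power handheld topped up
# from a case-mounted solar panel. All values are assumptions.

mcu_power_w   = 0.010   # ~10 mW active draw, Cortex-M0+ class (assumed)
panel_area_m2 = 0.005   # ~50 cm^2 of panel on the lid (assumed)
irradiance    = 200.0   # W/m^2, overcast / near-a-window light (assumed)
efficiency    = 0.20    # panel conversion efficiency (assumed)

harvest_w = panel_area_m2 * irradiance * efficiency
print(f"harvested: {harvest_w:.3f} W, "
      f"covers ~{harvest_w / mcu_power_w:.0f}x the MCU's active draw")
```

Even in weak light the panel comfortably out-produces the MCU, which is what makes the "charges itself on the windowsill" idea plausible for a device in this power class.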
Such a device would have very little computing power compared to even a smartphone, but it could still be useful for a lot of things. Besides things like text editors or spreadsheets, you could replicate the functionality of the Wiki Reader and the Cybiko (imagine something like the Cybiko with LoRaWAN). You could maybe even keep a copy of Open Street Map on there, though I don’t know how computationally expensive parsing its data format and displaying a map segment is.
It is from 2018, but how do you imagine that this was written by AI given that LLMs barely existed at the time and weren’t accessible by the general public?
Yeah, I’m not an expert in construction but I don’t really know what this buys you vs using, for example, insulating concrete forms.
It might be less the quality of the research and more this:
(This comic is a bit outdated nowadays, but you get the idea).
Except the headlines say “scientists report discovery of miraculous new battery technology using A!”.
Also I think people don’t realize how long it takes to commercialize battery technology. I think they put them in the same mental category as computers and other electronics, where a company announces something and then it’s out that same year. The first lithium ion batteries were made in a lab in the 1970s. A person in 2000 could have said “I’ve been hearing about lithium ion batteries for decades now and they’ve never amounted to anything”, and they would have been right, but it’s not because it’s a bunk technology or the researchers were quacks.
With electric cars you might not even need a special charger so much as a special charging cycle. It’s already the norm for cars to tell the charger what voltage and current they want, and it’s already the norm for cars to carefully control their battery’s temperature during charging.
That’s not to say you’d necessarily be able to do this with just a software update, but it’s not too far off from the current paradigm.
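The "car commands, charger obeys" pattern can be sketched in a few lines. This is a toy illustration with made-up numbers and function names, not any real charging standard (the actual protocols are things like ISO 15118), but it shows why a new charging cycle is mostly a change to the vehicle's request logic:

```python
# Toy sketch of DC fast-charging negotiation: the vehicle repeatedly
# computes the voltage/current setpoint it wants from its own battery
# state, and the charger simply tracks that request.
# All names and numbers here are hypothetical.

def vehicle_request(soc, cell_temp_c):
    """Return the (voltage, current) setpoint the car asks for,
    given state of charge (0..1) and cell temperature in Celsius.
    Taper rules and limits are illustrative, not from any real BMS."""
    max_current = 250.0           # A, assumed pack limit
    if cell_temp_c < 10:          # cold battery: taper hard
        max_current *= 0.3
    if soc > 0.8:                 # near full: taper for cell longevity
        max_current *= 0.25
    return 400.0, max_current     # ~400 V class pack (assumed)

# A special "charging cycle" (pre-heating, pulsed current, etc.) would
# mostly mean changing this request logic, not the charger hardware.
print(vehicle_request(0.5, 25))   # warm pack, mid-charge: full current
print(vehicle_request(0.9, 5))    # cold and nearly full: heavy taper
```

Since the charger is already just a voltage/current slave in this arrangement, a chemistry that wants an unusual charge profile mainly needs new firmware on the vehicle side.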
IDK, it wouldn’t be the first time a news org published some random shit as fact because they’re too eager to be the first to report on something.
That sounds absolutely fine to me.
Compared to an NVMe SSD, which is what I have my OS and software installed on, every spinning disk drive is glacially slow. So it really doesn’t make much of a difference if my archive drive is a little slower at random R/W than it otherwise would be.
In fact I wish tape drives weren’t so expensive because I’m pretty sure I’d rather have one of those.
If you need high R/W performance and huge capacity at the same time (like for editing gigantic high resolution videos) you probably want some kind of RAID array.
This is an idea from the 1960s, back when they thought solar panels would be like computer chips: staying super expensive per unit area, but becoming exponentially better at converting sunlight into electricity.
It makes absolutely zero sense to spend billions of dollars putting solar panels in space and beaming the power back to earth now that they are so cheap per unit area. The one thing you could argue a space-based solar array could do is stretch out the day length so you need less storage, but that’s easier to accomplish using long electrical cables.