If they had said “locally hosted in our datacenter”
Then that would also be an oxymoron.
Local is the opposite of remote. This is a remote server. Remote servers are not local. This is not a matter of interpretation.
Why does local mean local? I’m not sure I understand your question.
“Locally hosted” means it’s running on the local host. In this case, that would mean on the same computer running Firefox.
Calling something that is only accessible over the internet “locally hosted” is outrageous doublespeak.
Orbit currently uses a version of Mistral LLM (Mistral 7B) that is locally hosted on Mozilla’s Google Cloud Platform instance.
Hmm.
>locally hosted
>Google Cloud
Hmmmmmmmmmmmmmmmm.
So probably there will be some systems other than Linux that do use Rust
There’s one called Redox that is entirely written in Rust. Still in fairly early stages, though. https://www.redox-os.org/
Installing Linux after Windows should be fine without disconnecting drives.
The reverse is troublesome. Microsoft’s installer is all too happy to shit on your drives, even the ones you’re not using for installation. But Linux installers are much more friendly to dual-booting and all kinds of complex setups.
Copyleft licenses require that derivative works maintain that same license. Not all the Creative Commons licenses are copyleft. CC-BY only requires that attribution be given in the derivative work. That’s the one YouTube offers as an option for uploads.
If you want your video to be copylefted, you should use one of the SA (share-alike) licenses. See https://creativecommons.org/share-your-work/cclicenses/
I explained why that data does not contradict what the previous commenter was trying to say.
Speed is less of a factor than endurance in a persistence-hunting scenario where we’re much slower than our prey anyway.
I don’t know the facts for this specific claim, but the logic is fair. One group can be better suited for endurance without being faster. One group could also be faster on average without having the individual fastest performers. Not only because of cultural factors, but also because the distribution curves might have different shapes for men vs women. There could be greater outliers (top performers) among men even if the average is higher among women in general. It’s not necessarily as straightforward as, say, height, where men’s distribution curve is almost the same shape as women’s, just shifted up a few inches.
I don’t have the data to draw any real conclusions, though.
One of the problems with looking at athletic records is that they reflect only the elite among a self-selected group of enthusiasts, which doesn’t tell us a whole lot about what might have been the norm 100,000 years ago, or what might be the norm today if all else were equal between genders. These are not controlled trials.
I’ve read that the top women outperform the top men in long-distance open-water swimming, supposedly due in part to higher body fat making women more buoyant, helping to regulate body temperature, and providing fuel. This is the first time I’ve read that women might have an advantage in running, though.
I wish the article provided citations. The reality is probably too complex to fit into a headline or pop-sci writeup.
What part of this is misinformation, exactly? Seems pretty well-supported.
I keep seeing this claim, but never with any independent verification or technical explanation.
What exactly is listening to you? How? When?
Android and iOS both make it visible to the user when an app accesses the microphone, and they require that the user grant microphone permission to the app. It’s not supposed to be possible for apps to surreptitiously record you. This would require exploiting an unpatched security vulnerability and would surely violate the App Store and Play Store policies.
If you can prove this is happening, then please do so. Both Apple and Google have a vested interest in stopping this; they do not want their competitors to have this data, and they would be happy to smack down a clear violation of policy.
I agree completely.
I understand the motivation here — apps that lack location permission shouldn’t be able to get backdoor access to your location via your camera roll. That makes sense, because you know damn well every spyware social media company would be doing that if they could.
But the reverse is also true: apps that legitimately need to read photos and access all their metadata shouldn’t need to be granted full location access.
Yeah, I had to disconnect all my SATA HDs to stop the Windows installer from shitting all over them.
I’d be worried about Windows updates doing the same thing now, after the recent glitch that broke bootloaders.
It was bought out and cleaned up a few years ago. It’s legit again now, though I don’t think it’ll ever really recover from that fiasco.
Chromium itself will. Other Chromium-based browser vendors have confirmed that they will maintain v2 support for as long as they can. So perhaps try something like Vivaldi. I haven’t tried PWAs in Vivaldi myself, but it supports them according to the docs.
Debian still supports Pentium IIs. They axed support for the i586 architecture (original Pentium) a few years back, but Debian 12 (current stable, AKA Bookworm) still supports i686 chips like the P2.
Not sure how the rest of the hardware in that Compaq will work.
See: https://www.debian.org/releases/stable/i386/ch02s01.en.html
Probably ~15TB through file-level syncing tools (rsync or similar; I forget exactly what I used), just copying my internal RAID array to an external HDD. I’ve done this a few times, either for backup purposes or to prepare to reformat my array. I originally used ZFS on the array, but converted it to something with built-in kernel support a while back because it got troublesome when switching distros. Might switch it to bcachefs at some point.
With dd specifically, maybe 1TB? I’ve used it to temporarily back up my boot drive on occasion, on the assumption that restoring my entire system that way would be simpler in case whatever I was planning blew up in my face. Fortunately never needed to restore it that way.
For sure. It’ll never be enforced completely, but it gives teeth to go after some big offenders.
I don’t think there’s any way to count years without rooting it somewhere arbitrary. We cannot calculate the age of the planet, the sun, or the universe to the accuracy of a year (much less a second or nanosecond). We cannot define what “modern man” is to a meaningful level of accuracy, either, or pin down the age of historical artifacts.
Most computers use a system called “epoch time” or “UNIX time”, which counts the seconds since January 1, 1970 (UTC). Converting this into a human-friendly date representation is surprisingly non-trivial, since the human timekeeping systems in common use are messy and not rooted in hard math or in the scientific definition of a second, which was only standardized in 1967.
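In Python, for instance, turning an epoch timestamp into a human-readable UTC date is a one-liner, with all the messy calendar math hidden inside the standard library:

```python
from datetime import datetime, timezone

# Unix time: seconds since 1970-01-01 00:00:00 UTC (leap seconds ignored).
ts = 1_000_000_000  # the "billennium", a billion seconds after the epoch
dt = datetime.fromtimestamp(ts, tz=timezone.utc)
print(dt.isoformat())  # 2001-09-09T01:46:40+00:00
```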
Tom Scott has an amusing video about this: https://www.youtube.com/watch?v=-5wpm-gesOY
There is also International Atomic Time, which, like Unix Time, counts seconds from an arbitrary date that aligns with the Gregorian calendar. Atomic Time is rooted at the beginning of 1958.
ISO 8601 also aligns with the Gregorian calendar, but only as far back as 1582. The official standard does not allow expressing dates before that without explicit agreement of definitions by both parties. Go figure.
The core problem here is that a year, as defined by Earth’s revolution around the sun, is not consistent across broad time periods. The length of a day changes, as well. Humans all around the world have traditionally tracked time by looking at the sun and the moon, which simply do not give us the precision and consistency we need over long time periods. So it’s really difficult to make a system that is simple, logical, and also aligns with everyday usage going back centuries. And I don’t think it is possible to find any zero point that is truly meaningful and independent of wishy-washy human culture.
This is a FAQ for end users, about a feature in software running on end users’ computers.
It is absolutely doublespeak to call it “local”. Are we supposed to invent an entirely new term now to distinguish between remote and local? Please do not accept this usage. It will make meaningful communication much harder.
Edit: I mean seriously, by this token OpenAI, Google, Facebook, etc. could call their servers “locally hosted”. It is an utterly meaningless term if you accept this usage.