I exist or something probably
if you browse the play store on a web browser you can open app links directly to bypass this. for now.
problem: actual mental health care has low availability
solution: ai can stand in where needed
outcome: ai mental health care systemically expands while actual therapists remain inaccessible, as insurance refuses to cover them. mental health outcomes worsen across the board.
Hm, that is unexpected. obviously that doesn’t include the full manufacturing carbon cost of a rocket, but it’s probably close enough anyway.
not even a little, but matrioshka brains are cool
i highly doubt 10 years is even remotely close to breaking even on a rocket launch.
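to put rough numbers on that doubt, here is a back-of-envelope sketch. every figure in it is an assumption, roughly in the ballpark of a kerosene-fueled Falcon 9-class launch, and it ignores manufacturing entirely:

```python
# back-of-envelope CO2 for one kerosene-fueled launch (all inputs assumed,
# roughly Falcon 9-class; manufacturing and high-altitude effects ignored).

KEROSENE_BURNED_T = 150     # tonnes of RP-1 burned per launch (assumption)
CO2_PER_KG_KEROSENE = 3.1   # kg CO2 per kg of kerosene, from combustion chemistry

launch_co2_t = KEROSENE_BURNED_T * CO2_PER_KG_KEROSENE  # tonnes of CO2
print(f"launch emissions: ~{launch_co2_t:.0f} t CO2")

# for a 10-year break-even, whatever the launch enables would need to avoid
# this much CO2 elsewhere, every single year:
YEARS = 10
print(f"required annual savings: ~{launch_co2_t / YEARS:.0f} t CO2/yr")
```

so even before counting manufacturing, you’d need on the order of tens of tonnes of avoided CO2 per year for a decade just to cover the propellant.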
many techbros think this unironically and do preferentially hire asian male employees. google famously got in trouble for this.
If everyone gave up on a place when its future looked bleak, there wouldn’t be a place left in the world worth living in.
a warm summer and bad food prices. the regime expects the protests to be historic too.
have you considered for a moment, i know it’s crazy, that two authoritarian states historically champing at the bit to go to war might be doing this of their own volition?
what a strangely passive aggressive and rude response. if you want a comment written in your voice and with your chosen thoughts, you are free to write one yourself.
This (deploying malware and backdoors outside of wartime, often widely) is criticized very often, and rightfully so, by both cybersecurity people and people of various political leanings, especially leftists.
Your analogy is good. These things are often intended to kill, and are often countervalue (read: they target civilians). It is in fact bad no matter which state does it. It should, however, come as no surprise that all states variously want to do it, though the usa, for example, has historically gone back and forth on how selective it is, for many of the reasons you state. Other reasons include not wanting to reveal exact capabilities by releasing malware ahead of time, where it can be spotted and studied.
if you put the people making translation possible out of work, you will run out of sources for useful translations.
LLMs are not magic. They function off of human effort for their training data. High quality data is thus sourced from (in this case) human translators. Some of it can be sourced from nonprofessional texts without them, but it is not enough.
they can’t actually, but it’s convincing enough that you’ll think it’s the same, and in the process it makes it financially impossible for improvements to be made by actual translators.
Yes we agree on the first part.
I will again direct you here re: the second.
Where is the world model you maintain? Can you point to it? You can’t - because the human mind is very much a black box just the same way as LLM’s are.
It’s not really factually correct if you want to get pedantic (brains and llms are called black boxes for different reasons), but this is ultimately irrelevant. Your motive may be here or there; the rhetorical effect is the same. You are arguing very specifically that we can’t know llms don’t have similar features (a world model) to human brains because “both are black boxes”, which is wrong for a few reasons, but also plainly an equivalence. It’s rude to pretend everyone in the conversation is as illiterate as we’d need to be to not understand this point.
Where is the world model you maintain? Can you point to it? You can’t - because the human mind is very much a black box just the same way as LLM’s are.
something being a black box is not even slightly a notable feature of relation; it’s a statement about model detail. the only reason you’d make this comparison is if you want the human brain to seem equivalent to an llm.
for example, you didn’t make the claim: “The inner workings of Europa are very much a black box, just the same way as LLM’s are”
Not understanding the brain (note: said world model idea is something of a fabrication by the ai people; brains are distributed functional structures with many parts and roles) does not an equivalence with “ai” make. brains and llms do not function in the same way; this is a lie peddled by hype dealers.
depends entirely on the kind of drive.
This is generally in line with ICE; drivetrain efficiencies these days are in the high 90%s (applies to EVs too), so from the engine out you are losing basically everything to drag.
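to put rough numbers on the “everything to drag” part (all values assumed: a typical sedan with CdA ≈ 0.7 m², 1500 kg, a rolling resistance coefficient around 0.01, cruising at 110 km/h), a quick sketch:

```python
# rough split of road-load power at highway speed; every constant here is an
# assumed "typical sedan" value, not a measurement of any specific car.

RHO = 1.2       # air density, kg/m^3
CDA = 0.7       # drag coefficient * frontal area, m^2 (assumed)
MASS = 1500.0   # vehicle mass, kg (assumed)
CRR = 0.010     # rolling resistance coefficient (assumed)
G = 9.81        # gravity, m/s^2
v = 110 / 3.6   # 110 km/h in m/s

p_aero = 0.5 * RHO * CDA * v**3   # aerodynamic drag power, W
p_roll = CRR * MASS * G * v       # rolling resistance power, W

print(f"aero drag: {p_aero / 1000:.1f} kW")
print(f"rolling:   {p_roll / 1000:.1f} kW")
print(f"aero share of road load: {p_aero / (p_aero + p_roll):.0%}")
```

with those assumptions aero is roughly three quarters of the road load at highway speed, and since it scales with v³ it only gets more lopsided as you go faster; rolling resistance is most of the small remainder.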
the most useful takeaways are the current statistics; predictions about ai spreading because of demand are largely unfounded at this point. but even at current hype bubble usage it’s an absurd amount of energy.