Nah, it’s fine as long as it’s all upfront. Personally I don’t mind doing a couple of simple, quick chores (which I’d want to be sure of before renting), and I know that some people don’t want to do any chores at all, so you should be able to filter out those listings. Everyone wins except the scammy, usurious Airbnb hosts.
LanternEverywhere
- 0 Posts
- 38 Comments
They need a rule making the chore list part of the listing.
LanternEverywhere@kbin.social to Selfhosted@lemmy.world • What storage manufacturer do you swear to and why?
2 · 2 years ago
Yup, knock on wood, I’ve had lots of Seagate drives over the decades and I’ve never had any of them go bad. I’ve had two WD drives and they both failed
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint)
1 · 2 years ago
I just copy-and-pasted that exact question into chatgpt4, and here is its response:
There was no president of the United States in 3500 BC. The United States of America did not exist as a country until 1776, when the Declaration of Independence was signed.¹ The first president of the United States was George Washington, who took office in 1789.² Before that, the territory that is now the United States was inhabited by various Native American tribes and colonized by European powers such as Spain, France, and Great Britain.³
¹: Declaration of Independence
²: List of presidents of the United States - Wikipedia
³: History of the United States - Wikipedia
Source: Conversation with Bing, 2/27/2024
(1) List of presidents of the United States - Wikipedia. https://en.wikipedia.org/wiki/List_of_Presidents_of_the_United_States.
(2) List of presidents of the United States - Simple English Wikipedia, the … https://simple.wikipedia.org/wiki/List_of_Presidents_of_the_United_States.
(3) President of the United States - Wikipedia. https://en.wikipedia.org/wiki/President_of_the_United_States.
(4) List of presidents of the United States - Wikipedia. https://en.wikipedia.org/wiki/List_of_Presidents_of_the_United_States.
(5) List of presidents of the United States - Wikipedia. https://en.wikipedia.org/wiki/List_of_Presidents_of_the_United_States.
(6) List of U.S. Presidents in Chronological Order - HistoryNet. https://www.historynet.com/us-presidents/.
(7) Getty Images. https://www.gettyimages.com/detail/photo/seal-of-the-president-of-the-united-states-royalty-free-image/1084903580.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint)
1 · 2 years ago
Are you using 4? Because it’s much better than the earlier versions
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint)
1 · 2 years ago
That’s a fun story, but it isn’t applicable to the topic here. That could very easily be verified as true or false by a secondary system. In fact you can just ask Wolfram Alpha: ask it what the odds are that any two people share the same birthday. I just asked it that exact question and it replied 1/365.
EDIT
In fact I just asked that exact same question to chatgpt4 and it also replied 1/365.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint)
1 · 2 years ago
There already exist multiple LLMs that are essentially completely different from one another. In fact, this is one of the major problems with LLMs: even a small change to a model radically alters the output it returns for huge numbers of seemingly unrelated topics.
For your other point, I never said bouncing their answers back and forth for verification was trivial, but it’s definitely doable.
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint)
1 · 2 years ago
That’s not a problem at all. I already use prompts that allow the LLM to say it doesn’t know an answer, and it does take that option when it’s unable to find a correct answer. For instance, I often phrase questions like this: “Is it known whether or not red is a color in the rainbow?” For questions where it doesn’t know the answer, it will now tell you it doesn’t know.
And to your other point, the systems may not be capable of discerning their own hallucinations, but a totally separate LLM will be able to do so pretty easily.
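Roughly the pattern I mean, as a quick sketch in Python (ask_llm here is a hypothetical stand-in for whatever chat client you actually use, not a real API):

```python
# Sketch of the "allowed to say it doesn't know" prompting pattern.
# ask_llm() is a hypothetical placeholder for your actual LLM client.

def ask_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to your model of choice and return its reply."""
    raise NotImplementedError

def ask_with_unknown_allowed(question: str) -> str:
    # Phrase the question so "I don't know" is an explicitly acceptable answer.
    prompt = (
        f"Is it known whether or not the following is true? {question}\n"
        "If it isn't known, reply exactly: I don't know."
    )
    return ask_llm(prompt)
```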
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint)
2 · 2 years ago
Give an example of a statement that you think couldn’t be verified
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint)
2 · 2 years ago
No, I’ve used LLMs to do exactly this, and it works. You prompt it with a statement and ask “is this true, yes or no?” It will reply with a yes or no, and it’s almost always correct. Doing this verification through multiple different LLMs would eliminate close to 100% of hallucinations.
EDIT
I just tested it multiple times in chatgpt4, and it got every true/false answer correct.
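As a rough sketch of what I mean in Python (ask_llm is a hypothetical helper for whichever models you have access to, not a real library call):

```python
# Sketch: verify a statement by asking several independent LLMs "is this true, yes or no?"
# ask_llm(model, prompt) is a hypothetical helper; swap in whatever client you actually use.

def ask_llm(model: str, prompt: str) -> str:
    """Placeholder: send `prompt` to `model` and return its text reply."""
    raise NotImplementedError

def verify_statement(statement: str, models: list[str]) -> bool:
    """Return True only if a majority of the models answer 'yes'."""
    prompt = f'Is the following statement true? Answer only "yes" or "no".\n\n{statement}'
    votes = [ask_llm(m, prompt).strip().lower().startswith("yes") for m in models]
    return sum(votes) > len(votes) / 2

# Example: cross-check one model's claim against several others.
# verify_statement("George Washington took office in 1789",
#                  ["model-a", "model-b", "model-c"])
```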
LanternEverywhere@kbin.social to Technology@beehaw.org • Hallucination is Inevitable: An Innate Limitation of Large Language Models (arxiv preprint)
3 · 2 years ago
I very much doubt that hallucination is a limitation in final output. It may be an inevitable part of the process, but it’s almost certainly a surmountable problem.
Just off the top of my head I can imagine using two separate LLMs for a final output: the first one generates an initial output, and the second one verifies whether what it says is accurate. The chance of two totally independent LLMs having the same hallucination is probably very low, and you can add as many additional separate LLMs for re-verification as you like. The chance of a hallucination making it through multiple LLM verifications probably gets close to zero.
While this would greatly multiply the resources required, it’s just a simple example showing that hallucinations are not inevitable in final output.
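Something like this, as a minimal sketch (again, ask_llm is a hypothetical stand-in, and the retry count is arbitrary):

```python
# Sketch: one LLM drafts an answer, one or more independent LLMs re-verify it.
# ask_llm(model, prompt) is a hypothetical helper, not a real library call.

def ask_llm(model: str, prompt: str) -> str:
    raise NotImplementedError  # plug in your actual client here

def answer_with_verification(question: str, generator: str,
                             verifiers: list[str], max_retries: int = 3) -> str:
    for _ in range(max_retries):
        draft = ask_llm(generator, question)
        check = (f"Question: {question}\nProposed answer: {draft}\n"
                 'Is the proposed answer accurate? Reply only "yes" or "no".')
        # Accept the draft only if every independent verifier agrees it is accurate.
        if all(ask_llm(v, check).strip().lower().startswith("yes") for v in verifiers):
            return draft
    # Give up rather than return an answer that failed verification.
    return "I don't know."
```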
LanternEverywhere@kbin.social to Technology@lemmy.ml • Europe’s deepest mine to become giant gravity battery
11 · 2 years ago
Where? Quote it for me. I looked in 5 separate dictionaries; they all say it’s a negative thing.
LanternEverywhere@kbin.social to Technology@lemmy.ml • Europe’s deepest mine to become giant gravity battery
52 · 2 years ago
Huh, definitionally it’s always a bad thing; I wonder why people around you use it that way
LanternEverywhere@kbin.social to Technology@lemmy.ml • Europe’s deepest mine to become giant gravity battery
8 · 2 years ago
Sounds like a double win, not a double whammy
LanternEverywhere@kbin.social to Technology@lemmy.ml • Europe’s deepest mine to become giant gravity battery
81 · 2 years ago
That’s similar, but different in a lot of meaningful ways. Pumped hydro like that requires a relatively large body of water right next to a large elevation difference. This new system doesn’t require any water, and it uses a man-made hole in the ground that has already been created and would otherwise simply go unused.
LanternEverywhere@kbin.social to Technology@lemmy.ml • Europe’s deepest mine to become giant gravity battery
262 · 2 years ago
That’s so cool!
LanternEverywhere@kbin.social to Technology@beehaw.org • Only 150+ apps have been designed specifically for Apple's Vision Pro, so far | TechCrunch
9 · 2 years ago
Get a Quest; you can stream your videos to a huge virtual screen for literally 10% of the price of an Apple Vision.
LanternEverywhere@kbin.social to Technology@beehaw.org • Only 150+ apps have been designed specifically for Apple's Vision Pro, so far | TechCrunch
15 · 2 years ago
Apple Vision will be a very good product… in a few years, after it’s much cheaper and more capable. But as of today, you can get an Oculus Quest, which does a large percentage of the same stuff for literally 10% of the price.


The whole topic of drugs could easily be covered in 30 minutes. The only thing people under 18 need to know is this:
- There is a large variety of recreational drugs, each of which makes you feel a different way and comes with its own set of risks and benefits.
- At some point when you’re older it may be reasonable for you to try some particular drugs, but there are some drugs that are never safe for anyone at any age.
- No drugs are safe for you to do yet. Your brain is still developing, and drugs that might be safe for you later will be very harmful at this age. Even though taking a drug might make you feel good in the moment, it could very likely leave your growing brain depressed as soon as you come down, and that can turn into intense sadness that stays with you for the rest of your life.
So for now, just know that drugs are a complex topic you can learn more about when you’re older; the details don’t matter yet, because all drugs will be harmful to you while your brain is still growing.