this post was submitted on 25 Apr 2024
53 points (100.0% liked)

I really want to use AI tools like LLaMA, ChatGPT, Midjourney, etc. for something productive. But over the last year, the only real use I found for them was suggesting places to visit as a family on our trip to Hokkaido, Japan. The suggestions were genuinely good.

But perhaps you all have some great use cases for AI in your lives?

[–] Interstellar_1@pawb.social 3 points 6 months ago (1 children)

I use it to see the answers to problems on my physics homework when I can't figure them out myself. It works far better than homework-help forums, most of which are paywalled these days.

[–] driving_crooner@lemmy.eco.br 1 points 5 months ago (1 children)

If you're using ChatGPT for academic purposes, start your prompt with "Pretend you are an expert professor on {subject} helping me understand {topic}."
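
If you're hitting the API instead of the web UI, the same framing goes in the system message. Here's a minimal sketch using the `openai` Python package; the model name and the sample question are just placeholder assumptions, not anything prescribed by this tip:

```python
# Minimal sketch: the "expert professor" framing as a system message.
# The model name and the homework question are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

subject, topic = "physics", "conservation of momentum"

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any chat-capable model would do
    messages=[
        {"role": "system",
         "content": f"Pretend you are an expert professor on {subject} "
                    f"helping me understand {topic}."},
        {"role": "user",
         "content": "Why do two carts that stick together after a "
                    "collision lose kinetic energy but keep momentum?"},
    ],
)
print(response.choices[0].message.content)
```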

[–] DdCno1@beehaw.org 4 points 5 months ago (1 children)

Both of you need to read up on the phenomenon called hallucination.

[–] blindsight@beehaw.org 1 points 5 months ago

LLMs can be great for explaining things that have concrete solutions, like physics and math problems, when they have a separate "computations" tool bolted onto them, like ChatGPT does. Usually, you can check the answer in the back of the book anyway, so it's very easy to catch fact hallucinations.
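
You don't even need ChatGPT's built-in tooling for that cross-check; a few lines of SymPy will verify the model's number independently. A sketch with an invented free-fall problem and an invented "model answer", purely for illustration:

```python
# Sketch: independently verifying an LLM's numeric answer with SymPy.
# The problem and the "model's answer" below are invented placeholders,
# not output from any real chat.
import sympy as sp

t = sp.symbols("t", positive=True)
g, h = sp.Rational(981, 100), 20  # g = 9.81 m/s^2, drop height 20 m

# Free fall: solve h = (1/2) * g * t^2 for the fall time.
fall_time = sp.solve(sp.Eq(h, g * t**2 / 2), t)[0]

llm_answer = 2.02  # seconds; hypothetical value the chatbot gave
assert abs(float(fall_time) - llm_answer) < 0.01, "answer doesn't check out"
print(f"fall time ≈ {float(fall_time):.2f} s")
```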

I wouldn't worry about source hallucinations with this either. I don't think it would even come up?