sinedpick

joined 1 year ago
[–] sinedpick@awful.systems 9 points 3 months ago

that sounds like a super pleasant and stable molecule

[–] sinedpick@awful.systems 7 points 3 months ago

Idealism? check

False dichotomy between said idealism and cherry-picked reality? check

Mistakes avoided by anyone with a mental age over 16.

[–] sinedpick@awful.systems 6 points 3 months ago* (last edited 3 months ago) (5 children)

huh? public school is free

[–] sinedpick@awful.systems 8 points 3 months ago (1 children)

How did they respond to the counterargument that humans are simply... built different?

[–] sinedpick@awful.systems 6 points 3 months ago* (last edited 3 months ago) (3 children)

Thanks for the suggestions. The LLM is free to use (for now), so I thought I'd poke it and see how much attention I should actually be paying to these things this time around.

Here are its answers. I can't figure out how to share chats from this god-awful garbage UI, so you'll just have to trust me or try it yourself.

  1. It gives the correct but unnecessary answer: "If I were to ask you which door leads to freedom, which door would you point to?" It also mentions a lying guard, but acknowledges that one is absent from this specific problem.
  2. "A table or a chair"
  3. Completely fails on this one; it missed the sentence "Everyone knows the color of their eyes".
  4. Not sure what to do with this
  5. "While a Hadamard matrix of order 2672 might exist, its existence isn't immediately provable using the most common constructions" -- I won't pretend to know anything about the Hadamard conjecture if that's a real thing so I have no idea what it's on about here.

edit: I didn't do any prompt engineering, just straight copy paste.
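
For what it's worth, the Hadamard conjecture is a real (and still open) thing: it says an n×n matrix of ±1 entries with H Hᵀ = nI exists for every n divisible by 4. Below is a minimal sketch of my own (nothing here comes from the chat) of the two constructions people usually mean by "common constructions", Sylvester doubling and Paley construction I, plus a trial-division check of whether order 2672 = 2671 + 1 happens to be within their reach.

```python
# A minimal sketch (mine, not from the LLM chat) of two textbook Hadamard
# constructions, plus a check on order 2672. Everything here is standard.
import numpy as np

def is_hadamard(H):
    """True iff H has +/-1 entries and H @ H.T == n * I."""
    n = H.shape[0]
    return bool(np.all(np.abs(H) == 1)) and np.array_equal(H @ H.T, n * np.eye(n, dtype=int))

def sylvester(k):
    """Sylvester doubling: a Hadamard matrix of order 2**k."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

def paley_one(q):
    """Paley construction I: a Hadamard matrix of order q + 1 for a prime q with q % 4 == 3."""
    squares = {(x * x) % q for x in range(1, q)}                        # nonzero quadratic residues mod q
    chi = lambda a: 0 if a % q == 0 else (1 if a % q in squares else -1)
    Q = np.array([[chi(i - j) for j in range(q)] for i in range(q)])    # Jacobsthal matrix
    S = np.zeros((q + 1, q + 1), dtype=int)                             # skew-symmetric core
    S[0, 1:] = 1
    S[1:, 0] = -1
    S[1:, 1:] = Q
    return np.eye(q + 1, dtype=int) + S

def is_prime(n):
    return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# Sanity checks on small orders.
print(is_hadamard(sylvester(4)))    # order 16
print(is_hadamard(paley_one(11)))   # order 12 (11 is prime, 11 % 4 == 3)

# Order 2672: Sylvester alone only reaches powers of two, but Paley I reaches
# q + 1 whenever q = 2671 is a prime congruent to 3 mod 4. Let the code decide.
q = 2671
print(is_prime(q), q % 4)           # "True 3" would mean Paley I covers order 2672
```

If that last line prints "True 3", then order 2672 falls straight out of Paley I, which would make the model's "isn't immediately provable using the most common constructions" line a confident-sounding miss; either way, the point is that the claim is checkable.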

[–] sinedpick@awful.systems 4 points 3 months ago (14 children)

I tried using Claude 3.5 Sonnet and... it's actually not bad. Can someone please come up with a simple logic puzzle that it abysmally fails on so I can feel better? It passed the "nonsense river challenge" and the "how many sisters does the brother have" tests, both of which fooled GPT-4.

[–] sinedpick@awful.systems 8 points 4 months ago

In this case, the context is definitely humans being born on earth. The entire diatribe I responded to can be summed up as "People have all kinds of ethical and moral objections to surrogacy. In this post, I dismiss all of those without an argument, and instead assign positive moral value to everything that increases the number of lives, including surrogacy." It's probably one of the dumbest things I read this week.

[–] sinedpick@awful.systems 4 points 4 months ago

AHAHAHAHAAH they had fucking piddly little fans blowing. I hope that made the fire worse.

[–] sinedpick@awful.systems 6 points 4 months ago

It gets even worse when you add YC's claim that it doesn't "fund ideas" but rather "funds people." They didn't fund Austen (a shit person) because he had a good idea (he didn't). They funded Austen (a shit person) because they liked him (a shit person).

[–] sinedpick@awful.systems 9 points 4 months ago* (last edited 4 months ago) (2 children)

Deep into that diatribe:

> Some people's moral intuitions are that nonexistence is preferable to, or not obviously worse than, existence in a less-than-ideal setting. I wholly reject this intuition, and looking at the record of the persistence of life in the face of adversity, belong to a heritage of those who have, time and time again, rejected it. Life is Good.

What a disgustingly privileged thing to say. People have survived in shitty situations, so more children in poverty is axiomatically good? ~~This guy deserves poverty.~~ (edit: maybe that's a bit too far, but I fucking hate this guy)

[–] sinedpick@awful.systems 9 points 4 months ago (1 children)

I think you're being too generous. What they wanted to say is "There are genetic traits associated with intelligence." However, not inserting probability distributions in every fucking sentence is a class 2 misdemeanor in Rat circles, hence what was written.

[–] sinedpick@awful.systems 5 points 4 months ago

This is going to happen in any prediction market whose events are opened and closed manually. Unless you can automatically halt trading the moment the condition is met (which is probably what people call an "AI-complete" problem), there will always be people who notice the condition has been met before the event runner does. The unfair trades will certainly be reversed if this is a prediction market worth even a little bit of its salt.
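
To make the latency gap concrete, here's a toy sketch (entirely hypothetical, not any real market's API) of a manually resolved market: the real-world condition is met at one tick, the human event runner only closes the market at a later tick, and anyone watching the real world in between gets to trade against stale prices.

```python
# Toy model (hypothetical, no real prediction-market API) of the gap between
# "condition is met" and "a human closes the market".
from dataclasses import dataclass

@dataclass
class Market:
    question: str
    open: bool = True
    yes_price: float = 0.50          # implied probability of YES

    def trade(self, side: str, trader: str):
        if not self.open:
            raise RuntimeError("market closed")
        print(f"{trader} buys {side} at {self.yes_price:.2f}")

m = Market("Will X happen by Friday?")
condition_met_at = 100               # tick at which the event really resolves YES
resolver_notices_at = 160            # tick at which the human event runner closes it

for tick in range(200):
    if tick >= condition_met_at and m.open and m.yes_price < 0.99:
        m.trade("YES", trader="fast_observer")   # knows the answer, market still open
        m.yes_price = 0.99                       # takes everyone else's money
    if tick == resolver_notices_at:
        m.open = False
        print(f"closed at tick {tick}, {tick - condition_met_at} ticks after the condition was met")
```

Automatic resolution would collapse resolver_notices_at down to condition_met_at; everything in between is the window the unfair trades come from.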
