…cogito, ergo sum…

  • 0 Posts
  • 6 Comments
Joined 1 month ago
Cake day: December 3rd, 2025


  • Artwork@lemmy.world to Selfhosted@lemmy.world · Selfhosted coding assistant?

    No, thank you. Sorry, never.

    Not only that, but the sheer probability of mistakes is deafening. The last time I used an LLM was in 2023, when someone recommended it for a paperwork task, and I got a literal headache within 10 minutes… Since then I have sworn never to use that sorry thing for anything beyond black-box pentesting or generating experimental, unverified data of the kind you might find in isolated medical or military solutions.

    That deafening feeling that every single bit of output from that LLM, that void machine, may contain a mistake no soul is accountable for or can be asked about… A generated piece of someone's work you simply cannot verify, since neither a source nor a human is available… How would you trace the rationale that produced the output shown?

    Faster? Is that so… Doesn't verifying every output take even more time: testing it until it can be considered stable, proving it is correct, staying accountable for the knowledge and actions you perform as a developer, artist, researcher… human?

    Your mind is meant to be trained to do research and to remember, not to depend on someone's service to the point of predominance or replacement.
    Meanwhile, the effort, passion, creativity, empathy, and love that you carry are what support you in the long term.

    You may not care now, though; you do you. It's your own mind and memory you develop.





  • Thank you, but I do disagree. You cannot know whether the "result" from that LLM includes all the required context, and you won't ask it to clarify, since the output already omits what is relevant; in the end you miss out on the knowledge and waste the time, too.

    How can you be sure the output includes what is relevant? Will you ever re-submit the question to an algorithm without even knowing that a re-submission is needed, since there is no indication of it? In other words: the LLM simply did not include what you needed, did not include the important context surrounding it, and did not even tell you which authors to question further. No attribution, no accountability, no sense, sorry.