Y'all are frickin heroes
Deflock.me is absolutely depressing; it shows just how much surveillance is out there.
Would be great to have a navigation app that could map routes that avoid or minimize the number of surveillance points.
Currently I just try to be diligent about marking any police or ICE activity in Waze, but a FOSS option would be great.
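In principle, a surveillance-avoiding router just needs to penalize road segments near known camera locations when computing the shortest path. A minimal sketch (the graph, segment data, and penalty value are all hypothetical, not from any real app or dataset), assuming each edge carries a distance and a count of cameras along it:

```python
import heapq

def route(graph, start, goal, camera_penalty=5.0):
    """Dijkstra over a graph where each edge is (neighbor, distance_km, cameras).

    The effective edge cost is distance plus a penalty per camera, so the
    cheapest path trades a longer drive against fewer surveillance points.
    """
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, km, cams in graph.get(u, []):
            nd = d + km + camera_penalty * cams
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    # Walk predecessors back from the goal to rebuild the path.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return list(reversed(path))

# Toy graph: the direct road A->C passes 2 cameras; the detour via B passes none.
g = {
    "A": [("C", 1.0, 2), ("B", 1.5, 0)],
    "B": [("C", 1.5, 0)],
    "C": [],
}
print(route(g, "A", "C"))  # -> ['A', 'B', 'C'], taking the camera-free detour
```

Tuning `camera_penalty` lets the user pick where they sit between "shortest route" (penalty near 0) and "avoid cameras at almost any cost" (large penalty); a real app would pull camera locations from crowdsourced data like Deflock's.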
We already have the openmaps project. Perhaps someone could make a CoMaps plugin or add-on? Might be neat.
https://dnspmap.com/ tries to do that, but it's a little difficult to use.
That’s a cool project, thank you. Will check this out further.
You can see Benn Jordan’s videos (referenced in the article) here: https://peertube.gravitywell.xyz/w/5xhkuDuVsWZ2jbsVw32Una
This man is a gift to all in these times.
Spread his word.
Learn his craft.
Build those tools.
Oooh, he publishes on peertube? That’s really cool
Sorry, it’s off topic, but how do I subscribe to this guy from the instance I’ve signed up with? I searched for his channel but nothing showed up. So does that mean I’m SOL because my instance isn’t federated with his?
Might not apply to Lemmy, but I’m pretty sure PieFed users can follow from !benjordan@peertube.gravitywell.xyz
I wish I had learned how to hack.
The online communities are typically great. If you get really stuck, LLMs can be nice for dealing with your specific confusion.
Edit: … but it’s better to ask the community so others can benefit from the answer.
Please no. Absolutely not. An LLM is absolutely not “nice for dealing with confusion” but the very opposite.
Please do consider people’s effort, articles, and attributions, and actually learn and organize your knowledge. Please do train your mind and your self-confidence.

You can’t rely on LLMs to get actual answers for technical things, but they can help avoid a huge amount of wasted time and effort: the back-and-forth, going in circles, and talking around or past the issue that you see in threads everywhere in these kinds of expert niche communities. Besides, maybe my question has already been answered.
Sometimes I don’t know the specific terms or framing, am missing context, or am trying to get from A to C without knowing that B even exists, never mind how (or whom) to ask about it. If I can accelerate the process of clearing that up, I can go to the correct human expert or community with a much better handle on what I’m actually looking for and how to ask for it.
Thank you, but I do disagree. You cannot know whether the LLM’s result includes all the required context, and you won’t re-clarify it, since the output already omits what’s relevant; in the end you miss the knowledge and waste the time, too.
How can you be sure the output includes what’s relevant? Will you ever re-submit the question to an algorithm, without even knowing that re-submitting is required, since there’s no indication of it? That is, the LLM may simply not include what you needed, omit the important context surrounding it, and not even tell you which authors to question further: no attribution, no accountability, no sense, sorry.
I’m not sure we disagree. I agree that LLMs are not a good source for raw knowledge, and it’s definitely foolish to use them as if they’re some sort of oracle. I already mentioned that they are not good at providing answers, especially in a technical context.
What they are good at is gathering sources and recontextualizing your queries based on those sources, so that you can pose your query to human experts in a way that will make more sense to them.
You’re of course in your absolute right to avoid the tech entirely, as it comes with many pitfalls. Many of these models are damn good at gathering info from real human sources, though, if you can be concise with your prompts and avoid the temptation of swallowing its “analysis”.