• 1 Post
  • 16 Comments
Joined 1 year ago
Cake day: July 5th, 2023


  • As a federated learning researcher, I love to see articles introducing the public to the idea. But this article is really drawing a comparison between the fediverse and federated learning that doesn’t make sense (to me).

    Beyond the fact that data and compute are distributed across separate machines, they really aren’t similar. Federated learning avoids sharing data: clients send gradients or model updates to a central aggregator, and raw data never leaves your device (see the sketch after this comment). The fediverse is roughly the opposite: it makes sharing data between servers easy while avoiding a central server.

    Additionally, this article makes it seem like medical researchers were inspired by the fediverse, but the FedAvg paper was released in 2016—two years before ActivityPub was introduced in 2018.
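    A minimal sketch of the pattern described above (federated averaging over local model updates), assuming a toy NumPy linear model; the datasets, learning rate, and round count here are illustrative stand-ins, not anything from the article or a real FL framework.

    ```python
    # FedAvg-style sketch: each client trains locally on private data and shares
    # only its model weights with an aggregator; raw data never leaves the client.
    import numpy as np

    rng = np.random.default_rng(0)

    def client_update(weights, X, y, lr=0.1, epochs=5):
        """Run a few steps of local gradient descent; return only the updated weights."""
        w = weights.copy()
        for _ in range(epochs):
            grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
            w -= lr * grad
        return w

    def fed_avg(client_weights, client_sizes):
        """Aggregate client models with a data-size-weighted average (FedAvg)."""
        total = sum(client_sizes)
        return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

    # Simulated private datasets -- in a real deployment these stay on each device.
    true_w = np.array([2.0, -1.0])
    clients = []
    for _ in range(3):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + 0.1 * rng.normal(size=50)
        clients.append((X, y))

    global_w = np.zeros(2)
    for _ in range(20):
        # Each client trains locally; only weights are sent back to the server.
        updates = [client_update(global_w, X, y) for X, y in clients]
        global_w = fed_avg(updates, [len(y) for _, y in clients])

    print("learned weights:", global_w)  # approaches true_w without pooling raw data
    ```

    The contrast with the fediverse is visible in the loop: the server only ever sees weight vectors, never the clients’ `(X, y)` data, whereas ActivityPub servers exchange the content itself.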

  • I agree. Also, there is a ton of really great programming and AI/ML content on Twitter. I hope much of it migrates to Mastodon and Lemmy, and I’m starting to see some of it appear on Threads as well. Instead of knee-jerk reactions to defederate from Threads to avoid the embrace-extend-extinguish that happened to XMPP, we should build technical safeguards to keep things open.

    Although Meta’s social media and privacy woes are well-known, Meta is one of the largest contributors to open source and AI/ML. We should be encouraging companies to federate while building safeguards into ActivityPub to discourage “embrace-extend-extinguish”. We’re already seeing fracturing in the fediverse: the Pleroma and Akkoma split, Beehaw defederating from lemmy.world and sh.itjust.works, and most Lemmy mobile clients not supporting Kbin. I want the fediverse to succeed, but this is driving users away. What’s the point if it ends up a ghost town? Do we want to go the way of Identi.ca and previous attempts at decentralized social networks?