Is there something like SETI for training open source models?

Reddit r/LocalLLaMA / 4/18/2026

💬 Opinion · Developer Stack & Infrastructure · Signals & Early Trends · Ideas & Deep Analysis

Key Points

  • The post asks whether a distributed-computing network similar to SETI or Folding@home could be created for training open-source machine learning models.
  • It suggests the network would use non-problematic licenses and potentially train models beyond just “open weights,” implying broader openness around data and training workflows.
  • The question is framed as a feasibility and reasonableness challenge, asking whether the idea is “stupid” or practical.
  • The discussion is positioned as community-driven, highlighting the desire to harness distributed volunteer compute for AI training rather than centralized resources.

Some years ago there were distributed-computing initiatives like SETI@home, or Folding@home for protein folding.

Would it be possible for the community to build a network like this for training open-source models, with non-problematic licenses and more than just open weights? Is that a stupid idea?
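For context, the usual building block behind volunteer-compute training proposals like this is federated averaging: each participant trains locally on its own shard of data, and a coordinator periodically averages the resulting model weights. The sketch below is a hypothetical, simplified simulation (linear regression, four simulated volunteers, all names and parameters my own), not anything described in the post; real internet-scale systems additionally need fault tolerance, compression, and verification of untrusted contributions.

```python
import numpy as np

def local_train(weights, data_x, data_y, lr=0.1, steps=5):
    """One volunteer's contribution: a few local SGD steps on linear regression."""
    w = weights.copy()
    for _ in range(steps):
        # Gradient of mean squared error for the linear model x @ w
        grad = 2 * data_x.T @ (data_x @ w - data_y) / len(data_y)
        w -= lr * grad
    return w

def federated_average(updates, sizes):
    """Coordinator-side aggregation: weight each update by its local data size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(updates, sizes))

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])   # ground-truth model the volunteers jointly recover
global_w = np.zeros(2)

for _round in range(20):
    updates, sizes = [], []
    for _ in range(4):  # four simulated volunteers, each with a private data shard
        n = int(rng.integers(20, 50))
        x = rng.normal(size=(n, 2))
        y = x @ true_w + rng.normal(scale=0.01, size=n)
        updates.append(local_train(global_w, x, y))
        sizes.append(n)
    global_w = federated_average(updates, sizes)

print(np.round(global_w, 2))  # converges toward [2., -1.]
```

The design choice worth noting is that only model weights cross the network, never the volunteers' raw data, which is what makes the SETI-style framing plausible for training rather than just inference.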

submitted by /u/Cherlokoms