I think this, like anything in tech, depends on the use case.
As an example, if I were an ethnographer working to document rural cooking techniques on the Isle of Skye, I might work with this group to stand up a public instance of Mealie. Success would depend on the project being a collective work, though: I'd work with the collective to meet the project's challenges over a loosely defined stretch of time.
I think the above could be a big success. On the other hand, I wouldn't count on the collective to host and maintain my personal tech stack. Maybe I'd pay them to advise, but little more.
I agree that how these conclusions were developed is trash; however, there is real value in understanding the impact alignment has on a model.
There is a reason public LLMs don't disclose how to make illegal or patented drugs, and why they shy away from difficult topics like genocide.
It isn't by accident; they were aligned by corporations to respect certain views of reality. All an LLM does is barf out a statistically viable response to a prompt. If the weights are biased, you deserve to know how.