

It’s kind of mind-blowing how dismissive people are of the ACA, even people who were aware of what things were like before it went into effect. It wasn’t by any means what it should have been, but medical access unequivocally improved vastly as a result of it.
Diffusion models iteratively convert noise across a space into forms; that’s what they are trained to do. That’s in contrast to, say, a GPT, which basically performs recursive token prediction in sequence. They’re just totally different models, both in structure and mode of operation. Diffusion models are actually pretty incredible imo, and I think we’re just beginning to scratch the surface of their power. A very fundamental part of most modes of cognition is converting the noise of unstructured multimodal signal data into something with form and intention, so being able to do this with a model, even if only in very, very narrow domains right now, is a pretty massive leap forward.
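To make the contrast concrete, here’s a toy sketch of the two generation loops in Python. Everything in it is illustrative and made up by me (the step functions are trivial stand-ins, not real models): the diffusion-style loop refines one whole noisy sample a little on every step, while the autoregressive loop emits one token at a time conditioned on the prefix it has so far.

```python
# Toy contrast between the two generation loops. Illustrative only: the "models"
# here are trivial lambdas, not trained networks.
import numpy as np

rng = np.random.default_rng(0)

def diffusion_style_sample(denoise_step, steps=50, dim=8):
    """Start from pure noise and repeatedly refine the entire sample toward form."""
    x = rng.normal(size=dim)              # unstructured noise over the whole space
    for t in range(steps, 0, -1):
        x = denoise_step(x, t)            # each step nudges every dimension at once
    return x

def autoregressive_style_sample(next_token, length=8):
    """Build the output one token at a time, each conditioned on what came before."""
    seq = []
    for _ in range(length):
        seq.append(next_token(seq))       # predict the next element from the prefix
    return seq

# Stand-in "models": shrink the noise toward zero / emit an incrementing token.
print(diffusion_style_sample(lambda x, t: x * (1 - 1.0 / (t + 1))))
print(autoregressive_style_sample(lambda prefix: len(prefix)))
```

The structural difference is the whole point: one process sculpts a complete object out of noise over many passes, the other extends a sequence left to right.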
A quick search turns up that AlphaFold 3, what they are using for this, is a diffusion architecture, not a transformer. It works more like the image generators than the GPT text generators. It isn’t really the same as “the LLMs”.
It’s interesting to me that NYC, Jewish, and gay/lesbian all had the same wildly incorrect estimate on average.
It won’t tell us what to do; it’ll do the very complex things we ask it to. The biggest issues facing our species and planet atm all boil down to highly complex logistics. We produce enough food to make everyone in the world fat. There is sufficient shelter and housing to make everyone safe and secure from the elements. We know how to generate electricity, and even distribute it securely, without destroying the global climate systems. What we seem unable to do is allocate, transport, and prioritize resources to effectively execute on these things, because they present very challenging logistical problems. The various disciplines underpinning AI dev, however, from ML to network science to the resource allocation algorithms that make your computer work, are all very well suited to solving logistics problems and building the systems that do. I really don’t see a sustainable future where “AI” is not fundamental to the logistics operations supporting it.
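To put “resource allocation algorithms” in concrete terms, here’s a minimal sketch of the classic transportation problem solved with scipy’s linprog. The warehouses, cities, supplies, demands, and costs are numbers I made up purely for illustration; real logistics problems have the same shape, just enormously larger and messier.

```python
# Minimal transportation problem: ship goods from warehouses to cities at minimum
# cost while meeting demand. All figures are invented for illustration.
import numpy as np
from scipy.optimize import linprog

supply = [100, 150]             # units available at each warehouse
demand = [80, 70, 60]           # units needed in each city
cost = np.array([[4, 6, 9],     # cost[i][j]: shipping one unit, warehouse i -> city j
                 [5, 3, 7]])

n_w, n_c = cost.shape
c = cost.ravel()                # decision variables x[i][j], flattened row-major

# Supply constraints: each warehouse ships no more than it has on hand.
A_ub = np.zeros((n_w, n_w * n_c))
for i in range(n_w):
    A_ub[i, i * n_c:(i + 1) * n_c] = 1

# Demand constraints: each city receives exactly what it needs.
A_eq = np.zeros((n_c, n_w * n_c))
for j in range(n_c):
    A_eq[j, j::n_c] = 1

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, bounds=(0, None))
print(res.x.reshape(n_w, n_c))  # optimal shipment plan
print(res.fun)                  # total shipping cost
```

Swap the units for food, housing, or electricity and the math is the same; that’s the sense in which these disciplines are suited to the logistics problems above.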
This is fucked up, but also it’s fucked up that there is a single public policy director for Israel AND the Jewish diaspora.
I imagine not, though I haven’t looked into it.
God, why is the games industry so fucking illiterate when it comes to IP law? File a trademark opposition? They’re suing! File a patent application without issued claims or even substantive examination? They’ve patented it! These aren’t crazy fucking complicated concepts, but games journalism actively stunts the market’s understanding of them.
There are plenty of free, open-source generative models available that you can run locally.
You are agreeing with the post you responded to. This ruling is only about training a model on legally obtained training data. It does not say it is OK to pirate works: if you pirate a work, you’ve committed copyright infringement no matter what you later do with the infringing copy. It does not talk about model outputs, which is a very nuanced issue and likely to fall along similar lines of analysis as music copyright imo. It only talks about whether training a model is intrinsically an infringement of copyright. And it isn’t, because any other conclusion would be insane and functionally impossible to differentiate from learning a writing technique by reading a book you bought from an author. Even in a model that has overfit its training data, the weights are in no way recognizable as any particular training datum; they’re a high-dimensional matrix of numbers defining relationships between features, and relationships between those relationships.
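As a toy illustration of that last point (random numbers standing in for the weights of a tiny hypothetical network, not anything taken from a real model or checkpoint): the artifact that training produces is just named arrays of floats relating features to features, and nothing in it reads back as any document.

```python
# What a trained model materially is: named arrays of floating-point numbers.
# These are random stand-ins for illustration, not weights from any real model.
import numpy as np

rng = np.random.default_rng(0)
weights = {
    "embed":  rng.normal(size=(1000, 64)),   # token id -> 64-dim feature vector
    "hidden": rng.normal(size=(64, 64)),     # relationships between features
    "output": rng.normal(size=(64, 1000)),   # features -> next-token scores
}
for name, w in weights.items():
    print(name, w.shape)                     # shapes and floats; no text anywhere
```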
I agree. I’m generally pretty indifferent to this new generation of consumer models (the worst thing about it is the incredible number of idiots flooding social media to witch-hunt it or evangelize it without understanding either the tech or the law they’re talking about), but the people who use it so frequently, for so many fundamental things, that it’s observably diminishing their basic competencies and health are really unsettling.