Lord Protector Posted February 24
Recommended viewing: multimodal AI
Lord Protector Posted February 24
A good tutorial on the attention mechanism, by the same author.
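For reference, a minimal NumPy sketch of scaled dot-product attention, the core operation such tutorials cover; the shapes and names follow the usual convention, not the video itself:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                              # weighted sum of values

Q = np.random.randn(4, 8)   # 4 query positions, head dim 8
K = np.random.randn(6, 8)   # 6 key/value positions
V = np.random.randn(6, 8)
print(attention(Q, K, V).shape)   # (4, 8): one output vector per query
```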
Ras Posted March 1
Laurie Anderson has an exhibition, I'll Be Your Mirror, about an AI text generator that imitates Lou Reed's style: Laurie Anderson: I'm completely "hooked" on artificial intelligence and my late husband Lou Reed (nova.rs)
The piece on the Nova portal is adapted from a Guardian article (I can't post the direct link): laurie anderson i'll be your mirror exhibition - Google search
Lord Protector Posted March 2
The current state of play, quote:
Quote
The most powerful chip, with the most cores, the most memory and the highest speed, is the Cerebras WSE-2: 850,000 cores, 2.6 trillion transistors, and some 40 GB of on-chip memory with 20 petabits per second of bandwidth. A single such chip supports training LLMs of up to 120 trillion parameters, and systems are offered with up to a maximum of 192 of these chips, although 16 are enough for most NLP models. The service can be rented from Cirrascale Cloud Services, where for about $2,500 you can train ChatGPT-3 in just 10 hours, while a model with 70 billion parameters takes as long as 85 days and would cost about $2.5 million.
Unfortunately, training large language models for any real-world use is now in the hands of just a few companies and startups; everyone else has been pushed out of the game and can only tinker with smaller models, looking for slightly faster algorithms or better-quality models, which for real use will again only be trainable at those same few companies and startups.
https://venturebeat.com/ai/cerebras-unveils-worlds-larges-ai-training-supercomputer-with-54m-cores/
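To put the quoted 120 trillion parameters in perspective, a quick back-of-envelope (my own numbers, not from the article): the weights alone dwarf the 40 GB of on-chip memory, which is why Cerebras streams weights from external memory rather than holding them on the wafer:

```python
params = 120e12     # 120 trillion parameters, as quoted
on_chip_gb = 40     # WSE-2 on-chip memory, as quoted

for label, bytes_per_param in [("fp16/bf16", 2), ("int8", 1)]:
    weight_tb = params * bytes_per_param / 1e12
    print(f"{label}: {weight_tb:,.0f} TB of weights "
          f"(~{weight_tb * 1e3 / on_chip_gb:,.0f}x the on-chip memory)")
# fp16/bf16: 240 TB of weights (~6,000x the on-chip memory)
# int8: 120 TB of weights (~3,000x the on-chip memory)
```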
Lucia Posted March 2
I don't understand the last paragraph. If it's all so super fast and cheap, what's stopping a rush of new and existing companies from jumping into the game? What's stopping that cloud services provider from building its own models on its superior platform?
Lord Protector Posted March 2
@Lucia I don't know where that was said. Supercomputing is an extremely expensive sport, especially at the cutting edge. With LLMs, scale is the decisive factor: money, compute, parameters, talent... What I find interesting is that they have reached the magic figure of 120 trillion parameters (1.2 × 10^14) that the infrastructure can support. For the first time we are playing at the complexity level of the human brain: about 100 trillion synapses across 86 billion neurons.
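The arithmetic checks out; a one-line comparison (the synapse and neuron counts are the commonly cited estimates):

```python
params   = 120e12   # 120 trillion parameters = 1.2e14
synapses = 100e12   # ~100 trillion synapses (common estimate)
neurons  = 86e9     # ~86 billion neurons (common estimate)
print(f"{params:.1e} parameters vs {synapses:.1e} synapses "
      f"-> ratio {params / synapses:.1f}")   # 1.2e+14 vs 1.0e+14 -> ratio 1.2
```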
gone fishing Posted March 2
How much power does that beast draw? It must have its own transformer substation. If its consumption were comparable to that of classic CPUs, it would come out to about 3,000 kW, i.e. about 70,000 kWh per day, which is the consumption of roughly 5,000 households. But here is the latest:
Quote
supercomputer, Andromeda, which combines 16 WSE-2 chips into one cluster with 13.5 million AI-optimized cores, delivering up to 1 Exaflop of AI computing horsepower, or at least one quintillion (10^18) operations per second. The entire system consumes 500 kW, which is a drastically lower amount than somewhat-comparable GPU-accelerated supercomputers.
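Checking that estimate against the quoted Andromeda figure (the household consumption number is my assumption, not from the quote):

```python
est_kw = 3000                # the rough CPU-comparable guess above
andromeda_kw = 500           # quoted figure for the 16-chip cluster
household_kwh_per_day = 14   # assumed average household usage

for label, kw in [("estimate", est_kw), ("Andromeda (quoted)", andromeda_kw)]:
    kwh_day = kw * 24
    print(f"{label}: {kwh_day:,.0f} kWh/day "
          f"~ {kwh_day / household_kwh_per_day:,.0f} households")
# estimate: 72,000 kWh/day ~ 5,143 households
# Andromeda (quoted): 12,000 kWh/day ~ 857 households
```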
Lucia Posted March 16
On 1 January 2024 at 9:11, Lucia said:
Andrej Karpathy (OpenAI) is known for his hands-on YouTube series (From Zero to Hero, Let's Build GPT from scratch), which come highly recommended but may not be that appealing to the broadest audience without a basic software background. About a month ago he recorded this excellent Intro to LLMs video, which really is for everyone; even if almost all of it is familiar to you, the style of presentation and some of his outlook on the future make it worth watching.
@Lord Protector you liked this a couple of months ago in this very thread
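For anyone curious what the hands-on series looks like in practice: the GPT video starts from a character-level bigram model before adding attention. A minimal reconstruction of that starting point (my own toy, not code from the video):

```python
import numpy as np

text = "hello world, hello forum, hello again"
chars = sorted(set(text))
stoi = {c: i for i, c in enumerate(chars)}

# Count character bigrams with +1 smoothing; normalize rows to probabilities.
counts = np.ones((len(chars), len(chars)))
for a, b in zip(text, text[1:]):
    counts[stoi[a], stoi[b]] += 1
probs = counts / counts.sum(axis=1, keepdims=True)

# Sample a continuation one character at a time.
rng = np.random.default_rng(0)
idx, out = stoi["h"], ["h"]
for _ in range(20):
    idx = rng.choice(len(chars), p=probs[idx])
    out.append(chars[idx])
print("".join(out))
```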
Lord Protector Posted March 16
@Lucia Alzheimer's LLM
Shan Jan Posted March 25
My God, this is developing faster than I expected... BTW, shares of Nvidia, which makes the chips for AI, have risen 20-fold since 2019.
Lord Protector Posted May 8
Quote
Summary. The protein folding problem is the most important unsolved problem in structural biochemistry. The problem consists of three related puzzles: i) what is the physical folding code? ii) what is the folding mechanism? and iii) can we predict the 3D structure from the amino acid sequences of proteins? Bearing in mind the importance of protein folding, misfolding, aggregation and assembly in many different disciplines, from biophysics to biomedicine, finding solutions that would be generally applicable is of the utmost importance in biosciences.

https://am.pictet/en/luxembourg/mega/2023/power-of-proteins

Quote
Power of proteins
IMPLICATIONS OF AI ADVANCES IN PROTEIN FOLDING
August 2023

AI has enabled us to crack the secret of protein folding, opening the door to faster drug development, more resilient crops and bacteria-fuelled recycling.

Proteins are at the heart of cells, and cells are the building blocks of life. Understanding how protein structures form and change is key to understanding biology, paving the way for faster development of new drugs, the creation of more resilient crops and even the breaking down of plastic waste.

Yet protein structures have been, until recently, difficult to understand due to their 3-dimensional shape, folded from a linear polymer of the protein's amino acid building blocks. The folding allows for optimal interactions between the amino acids, and the end result is a bit like an origami made with a string of beads instead of paper.

"Determining a protein structure using experiments is labour-intensive and slow. Humanity has only done these a few hundred thousand times in the last half century since the first protein structure was determined," explains Dr Chris Bahl, co-founder of AI Proteins, a drug discovery platform. That may sound like a large number, but it's tiny compared with the hundreds of millions of possible structures out there. As well as years of painstaking work, elucidating a protein structure has often required costly techniques such as X-ray crystallography and cryo-electron microscopy.

That all changed in 2021 with the release of AlphaFold, developed by DeepMind in partnership with the European Molecular Biology Laboratory (EMBL), an intergovernmental research institute. Using artificial intelligence, AlphaFold can predict a protein structure from its amino acid sequence at a rate that "far outpaces humanity's ability", according to Bahl. The tool provides access to over 200 million protein structure predictions.

The following year, Facebook parent company Meta released a database showing the predicted shape of 600 million proteins from bacteria, viruses and microorganisms that had not yet been characterised. Their approach used a large language model (LLM), since popularised with the launch of ChatGPT, which can predict text from a few letters or words, creating a kind of protein 'autocomplete'.

A key difference between this and AlphaFold is that the language model does not need information about nearby amino acid sequences or multiple sequence alignments (MSA). MSA queries databases of protein sequences to identify similar sequences that are already known in living organisms. Instead, the language model can predict the structure of proteins that have no resemblance to other known proteins, giving it an advantage for detecting what would happen to a protein if there is a point mutation.
The algorithm is not as accurate as AlphaFold, according to researchers, but it is quicker, allowing scientists to predict structures in just two weeks. "I'm so happy to be a scientist who can actually live through this revolution," says Professor Edith Heard, Director General at EMBL.

Crucially, the new discoveries are widely available. AlphaFold is an open access resource, while Meta has published the code used to create its database. This approach gives the algorithms enormous reach and reflects tech companies' reliance on public data resources to build them: DeepMind's algorithms could only be developed with the data held by EMBL. "If we really wanted to make this a game changer, it had to be open [access], it had to be shared by all," says Heard.
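As a concrete illustration of the "protein autocomplete" idea, a hedged sketch using Meta's fair-esm package (interface as published in its README; the example sequence and masked position are arbitrary): mask one residue and ask ESM-2 which amino acid best fits, with no MSA involved:

```python
import torch
import esm  # pip install fair-esm

# Load a pretrained ESM-2 model and its tokenizer-like alphabet.
model, alphabet = esm.pretrained.esm2_t33_650M_UR50D()
model.eval()
batch_converter = alphabet.get_batch_converter()

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"   # arbitrary example sequence
_, _, tokens = batch_converter([("query", seq)])

pos = 10
tokens[0, pos + 1] = alphabet.mask_idx        # +1 for the BOS token

with torch.no_grad():
    logits = model(tokens)["logits"]          # per-position token scores

pred = logits[0, pos + 1].argmax().item()
print("masked:", seq[pos], "-> predicted:", alphabet.get_tok(pred))
```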
ragasto Posted May 19
...and Google has released Veo, their version of a text-to-video tool. For now it is reportedly weaker than Sora, but that's one more strong horse in the race.
https://blog.google/technology/ai/google-generative-ai-veo-imagen-3/#veo