
Technology Vision 2023 | When Atoms meet Bits #TechVision
Generalizing AI

155. Veisdal, J. (2019, September 12). The Birthplace of AI. Cantor's Paradise: https://www.cantorsparadise.com/the-birthplace-of-ai-9ab7d4e5fb00
156. Geoffrey Hinton: a look at the prolific career of AI and deep learning pioneer behind backpropagation, unsupervised learning. (2022, May 19). Ai.nl: https://www.ai.nl/artificial-intelligence/geoffrey-hinton-a-look-at-the-prolific-career-of-ai-and-deep-learning-pioneer-behind-backpropagation-unsupervised-learning/
157. Krizhevsky, A., Sutskever, I., et al. (2012). ImageNet Classification with Deep Convolutional Neural Networks. NIPS 2012: https://papers.nips.cc/paper/2012/hash/c399862d3b9d6b76c8436e924a68c45b-Abstract.html
158. Olanoff, D. (2015, December 11). Artificial Intelligence Nonprofit OpenAI Launches With Backing From Elon Musk and Sam Altman. TechCrunch: https://techcrunch.com/2015/12/11/non-profit-openai-launches-with-backing-from-elon-musk-and-sam-altman/
159. AlphaGo: The Challenge Match. (n.d.). Deepmind.com: https://www.deepmind.com/research/highlighted-research/alphago/the-challenge-match
160. Vaswani, A., Shazeer, N., et al. (2017, June 12). Attention Is All You Need. arXiv: https://arxiv.org/abs/1706.03762
161. Johnson, K. (2019, September 26). Hugging Face launches popular Transformers NLP library for TensorFlow. VentureBeat: https://venturebeat.com/business/hugging-face-launches-popular-transformers-nlp-library-for-tensorflow/
162. The Artificial Intelligence Act. (n.d.): https://artificialintelligenceact.eu/
163. Brown, T., Mann, B., et al. (2020, May 28). Language Models are Few-Shot Learners. arXiv: https://arxiv.org/abs/2005.14165
164. On the Opportunities and Risks of Foundation Models. (2021, August 16). Stanford University: https://fsi.stanford.edu/publication/opportunities-and-risks-foundation-models
165. Zhavoronkov, A. (2021, July 19). Wu Dao 2.0 – Bigger, Stronger, Faster AI from China. Forbes: https://www.forbes.com/sites/alexzhavoronkov/2021/07/19/wu-dao-20bigger-stronger-faster-ai-from-china/?sh=35dcf4e26fb2
166. Gault, M. (2022, August 31). An AI-Generated Artwork Won First Place at a State Fair Fine Arts Competition, and Artists Are Pissed. Vice: https://www.vice.com/en/article/bvmvqm/an-ai-generated-artwork-won-first-place-at-a-state-fair-fine-arts-competition-and-artists-are-pissed
167. A Generalist Agent. (2022, November 10). Deepmind.com: https://www.deepmind.com/publications/a-generalist-agent
168. ChatGPT: Optimizing Language Models for Dialogue. (2022, November 30). OpenAI: https://openai.com/blog/chatgpt/
169. Roose, K. (2022, December 5). The Brilliance and Weirdness of ChatGPT. The New York Times: https://www.nytimes.com/2022/12/05/technology/chatgpt-ai-twitter.html
170. Kreitman, A. (2022, December 13). Why My Writing Career Is Just About Over. The Marketing Advocate: https://medium.com/the-marketing-advocate/why-my-writing-career-is-just-about-over-deece661eaa2
171. GPT-4 is OpenAI's most advanced system, producing safer and more useful responses. (n.d.). OpenAI: https://openai.com/product/gpt-4
172. Wiggers, K. (2023, March 14). OpenAI releases GPT-4, a multimodal AI that it claims is state-of-the-art. TechCrunch: https://techcrunch.com/2023/03/14/openai-releases-gpt-4-ai-that-it-claims-is-state-of-the-art/
173. Vaswani, A., Shazeer, N., et al. Attention Is All You Need.
174. Ho, V. (2022, May 24). Build: Azure OpenAI Service helps customers accelerate innovation with large AI models; Microsoft expands availability. Microsoft: https://blogs.microsoft.com/ai/azure-openai-service-helps-customers-accelerate-innovation-with-large-ai-models-microsoft-expands-availability/
175. Dai, A., and Du, N. (2021, December 9). More Efficient In-Context Learning with GLaM. Google Research: https://ai.googleblog.com/2021/12/more-efficient-in-context-learning-with.html
176. Alvi, A., and Kharya, P. (2021, October 11). Using DeepSpeed and Megatron to Train Megatron-Turing NLG 530B, the World's Largest and Most Powerful Generative Language Model. Microsoft Research Blog: https://www.microsoft.com/en-us/research/blog/using-deepspeed-and-megatron-to-train-megatron-turing-nlg-530b-the-worlds-largest-and-most-powerful-generative-language-model/
177. Alford, A. (2022, June 7). Meta Open-Sources 175 Billion Parameter AI Language Model OPT. InfoQ: https://www.infoq.com/news/2022/06/meta-opt-175b/
178. Introducing PCL-BAIDU Wenxin (ERNIE 3.0 Titan), the World's First Knowledge Enhanced Multi-Hundred-Billion Model. (2021, December 28). Baidu Research: http://research.baidu.com/Blog/index-view?id=165
179. Liang, J. (2022, May 1). Foundation Models and the Future of Multi-Modal AI. Last Week in AI: https://lastweekin.ai/p/multi-modal-ai
180. On the Opportunities and Risks of Foundation Models.
181. A Generalist Agent.
182. Heikkilä, M. (2022, May 23). The hype around DeepMind's new AI model misses what's actually cool about it. MIT Technology Review: https://www.technologyreview.com/2022/05/23/1052627/deepmind-gato-ai-model-hype/
183. Merritt, R. (2022, March 25). What is a Transformer Model? NVIDIA: https://blogs.nvidia.com/blog/2022/03/25/what-is-a-transformer-model/
184. Huge "foundation models" are turbo-charging AI progress. (2022, June 11). The Economist: https://www.economist.com/interactive/briefing/2022/06/11/huge-foundation-models-are-turbo-charging-ai-progress
185. Heaven, W. (2021, December 21). 2021 was the year of monster AI models. MIT Technology Review: https://www.technologyreview.com/2021/12/21/1042835/2021-was-the-year-of-monster-ai-models/
186. Alayrac, J., Donahue, J., et al. (2022, April 29). Flamingo: a Visual Language Model for Few-Shot Learning. arXiv: https://arxiv.org/abs/2204.14198
187. Announcing the 2022 AI Index Report. (2022). Stanford University Human-Centered Artificial Intelligence: https://hai.stanford.edu/research/ai-index-2022
