References
Neuralink
- “Breakthroughs in Brain Implants,” IEEE Xplore, Dec. 2023. [Online]. Available: https://ieeexplore.ieee.org/document/10443814.
- R. Hart, “Elon Musk Teases First Neuralink Products After Company Implants First Brain Chip In Human,” Forbes, Jan. 30, 2024. [Online]. Available: https://www.forbes.com/sites/roberthart/2024/01/30/elon-musk-teases-first-neuralink-products-after-company-implants-first-brain-chip-in-human/.
- Neuralink, official website. [Online]. Available: https://neuralink.com/.
- “Neuralink’s Brain Chip: How It Works and What It Means,” Capitol Technology University, Feb. 9, 2024. [Online]. Available: https://www.captechu.edu/blog/neuralinks-brain-chip-how-it-works-and-what-it-means.
- L. Rinser, “Neuralink’s Blindsight Device: Restoring Vision Through Innovation,” Healthcare Minutes, Oct. 9, 2024. [Online]. Available: https://www.healthcareminutes.com/p/neuralinks-blindsight-device-restoring.
The Oracle of Delphi
- S. Biderman et al., “Pythia: A Suite for Analyzing Large Language Models Across Training and Scaling,” in Proceedings of the 40th International Conference on Machine Learning (ICML 2023), PMLR, vol. 202, pp. 2397–2430, 2023.
- A. Vaswani, N. Shazeer, N. Parmar, J. Uszkoreit, L. Jones, A. N. Gomez, Ł. Kaiser, and I. Polosukhin, “Attention is all you need,” in Proc. 31st Int. Conf. Neural Information Processing Systems (NIPS 2017), Curran Associates Inc., 2017, pp. 6000–6010.
- S. Raschka, “10 AI research papers of 2023,” Ahead of AI, Dec. 2023. [Online]. Available: https://magazine.sebastianraschka.com/.
- R. Dandekar, RajDandekar GitHub repository, 2024. [Online]. Available: https://github.com/RajDandekar. [Accessed: Dec. 15, 2024].
How to Build Your Own Chatbot?
- M. Marcus, B. Santorini, and M. A. Marcinkiewicz, “Building a large annotated corpus of English: the Penn Treebank,” Computational Linguistics, vol. 19, no. 2, pp. 313–330, 1993.
- J. Pennington, R. Socher, and C. D. Manning, “GloVe: Global Vectors for Word Representation,” in Proc. 2014 Conf. on Empirical Methods in Natural Language Processing (EMNLP), 2014, pp. 1532–1543, doi: 10.3115/v1/D14-1162.
- T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean, “Distributed Representations of Words and Phrases and their Compositionality,” arXiv:1310.4546, 2013. [Online]. Available: https://arxiv.org/abs/1310.4546.
- A. Radford, J. Wu, R. Child, D. Luan, D. Amodei, and I. Sutskever, “Language Models are Unsupervised Multitask Learners,” OpenAI, 2019. [Online]. Available: https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf.
- A. Wang, et al., “SuperGLUE: A Stickier Benchmark for General-Purpose Language Understanding Systems,” arXiv:1905.00537, 2019. [Online]. Available: https://arxiv.org/abs/1905.00537.
- P. He, X. Liu, J. Gao, and W. Chen, “DeBERTa: Decoding-enhanced BERT with Disentangled Attention,” arXiv:2006.03654, 2020. [Online]. Available: https://arxiv.org/abs/2006.03654.
- A. Radford, et al., “Learning Transferable Visual Models From Natural Language Supervision,” arXiv:2103.00020, 2021. [Online]. Available: https://arxiv.org/abs/2103.00020.
- A. Dosovitskiy, et al., “An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale,” arXiv:2010.11929, 2020. [Online]. Available: https://arxiv.org/abs/2010.11929.
- Y. Gao, et al., “Retrieval-Augmented Generation for Large Language Models: A Survey,” arXiv:2312.10997, 2023. [Online]. Available: https://arxiv.org/abs/2312.10997.
- P. Lewis, et al., “Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks,” arXiv:2005.11401, 2020. [Online]. Available: https://arxiv.org/abs/2005.11401.
- “Educational Regulations and Bylaws” (in Persian), Islamic Azad University. [Online]. Available: https://sep.iau.ir/ayin/Ayinname.asp.
- LangChain, official website. [Online]. Available: https://www.langchain.com/.
- “LangChain Python Documentation,” [Online]. Available: https://python.langchain.com/docs/introduction/.
- “LangGraph Tutorials: Introduction,” LangChain AI. [Online]. Available: https://langchain-ai.github.io/langgraph/tutorials/introduction/.
- “GloVe: Global Vectors for Word Representation,” Stanford University, [Online]. Available: https://nlp.stanford.edu/projects/glove/.
- “Word2vec,” Wikipedia. [Online]. Available: https://en.wikipedia.org/wiki/Word2vec.
- “Prompting Guide: Few-shot Learning,” [Online]. Available: https://www.promptingguide.ai/techniques/fewshot.
- “Prompting Guide: Retrieval-Augmented Generation (RAG),” [Online]. Available: https://www.promptingguide.ai/techniques/rag.
- OpenAI, official website. [Online]. Available: https://openai.com/.
- “Unsupervised Multitask Learning for Language Understanding,” arXiv:2409.18839, 2024. [Online]. Available: https://arxiv.org/abs/2409.18839.
- “Open Data Lab,” [Online]. Available: https://opendatalab.com.
- “MinerU by Open Data Lab,” Hugging Face, [Online]. Available: https://huggingface.co/spaces/opendatalab/MinerU.
- “Open Persian LLM Leaderboard,” Hugging Face, [Online]. Available: https://huggingface.co/spaces/PartAI/open-persian-llm-leaderboard.
The Evolving Role of Hardware in the Future of AI
- F. Fleuret, The Little Book of Deep Learning, 2023. [Online]. Available: https://fleuret.org/francois/lbdl.html.
- “NPU vs. GPU: What’s the Difference?”, IBM, [Online]. Available: https://www.ibm.com/think/topics/npu-vs-gpu.
The Art of Steering Language Models
- ChatGPT-4o, “Introduction to Prompt Engineering: From Artificial Intelligence Models,” edited by H. Karimi, Department of Industrial Engineering, Bojnord University, May 24–27, 140.
- M. Ariany, “LLM Prompt Engineering,” YouTube playlist. [Online]. Available: https://www.youtube.com/playlist?list=PLKI4_lXzsRRf_DNrqdzFBdV-VqknLGbZ7.