- GitHub - openai/gpt-oss: gpt-oss-120b and gpt-oss-20b are two open …
Try gpt-oss · Guides · Model card · OpenAI blog. Download gpt-oss-120b and gpt-oss-20b on Hugging Face. Welcome to the gpt-oss series, OpenAI's open-weight models designed for powerful reasoning, agentic tasks, and versatile developer use cases. We're releasing two flavors of these open models: gpt-oss-120b — for production, general-purpose, high-reasoning use cases that fit into a single H100 GPU.
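As a quick illustration, the sketch below loads the smaller checkpoint through the Hugging Face `transformers` text-generation pipeline. It assumes the weights are published under the `openai/gpt-oss-20b` model id and that a recent `transformers` release supports the architecture; this is a minimal sketch, not the repo's official workflow.

```python
from transformers import pipeline

# Minimal sketch: load gpt-oss-20b via the Hugging Face pipeline.
# Assumes the "openai/gpt-oss-20b" model id and enough GPU memory;
# device_map="auto" spreads the weights across available devices.
generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",
    torch_dtype="auto",
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Explain what an open-weight model is in one sentence."},
]

result = generator(messages, max_new_tokens=128)
# For chat-style input, generated_text is the message list; the last entry
# is the assistant's reply.
print(result[0]["generated_text"][-1])
```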
- Awesome GPT - GitHub
Awesome GPT: a curated list of awesome projects and resources related to GPT, ChatGPT, OpenAI, LLM, and more.
- GitHub - binary-husky/gpt_academic: A practical interaction interface for LLMs such as GPT and GLM, with special optimizations for paper reading …
GPT Academic (GPT 学术优化): If you like this project, please give it a Star; if you come up with a handy shortcut or plugin, pull requests are welcome! Read this in English | 日本語 | 한국어 | Русский | Français. All translations have been provided by the project itself.
- GPT-API-free / DeepSeek-API-free - GitHub
Free API Key: the gpt-5 series models have relatively weak reasoning ability; if you need stronger reasoning, you can purchase the paid API. The free API Key may only be used for personal non-commercial purposes, education, and non-profit scientific research.
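For a rough picture of how such a key is typically used, the sketch below calls an OpenAI-compatible endpoint through the official `openai` Python SDK. The base URL and model name here are placeholders, not the project's actual values; check the repository's README for the real endpoint and the models the free tier exposes.

```python
from openai import OpenAI

# Minimal sketch of using a free key against an OpenAI-compatible endpoint.
# Both the base_url and the model name below are hypothetical placeholders.
client = OpenAI(
    api_key="sk-...your-free-key...",
    base_url="https://example-free-endpoint/v1",  # replace with the project's endpoint
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # pick a model the free tier actually allows
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
```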
- GitHub - MuiseDestiny/zotero-gpt: GPT Meet Zotero.
🧠 Use GPT to generate reply text: supports gpt-3.5-turbo and gpt-4. 🏷️ Command tags: click once to accelerate your research. 💬 Ask questions about the current PDF file (full text or selected text). 💬 Ask questions about a selected paper (abstract). 📝 Summarize the selected paper into several highly condensed sentences.
- GitHub - openai/gpt-2: Code for the paper Language Models are …
gpt-2: code and models from the paper "Language Models are Unsupervised Multitask Learners". You can read about GPT-2 and its staged release in our original blog post, 6-month follow-up post, and final post. We have also released a dataset for researchers to study their behaviors.
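For a quick taste of the released weights, the sketch below samples from the smallest GPT-2 checkpoint via the Hugging Face port rather than the repo's original TensorFlow code; treat it as an illustrative shortcut, not the repository's own workflow.

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Minimal sketch: sample from the 124M-parameter GPT-2 checkpoint ("gpt2")
# through the Hugging Face port of the released weights.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

inputs = tokenizer(
    "Language models are unsupervised multitask learners because",
    return_tensors="pt",
)
outputs = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_k=40,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```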
- xtekky/gpt4free: The official gpt4free repository - GitHub
The official gpt4free repository | a varied collection of powerful language models | o4, o3 and deepseek r1, gpt-4.1, gemini 2.5 - xtekky/gpt4free
- GPT-3: Language Models are Few-Shot Learners - GitHub
Specifically, we train GPT-3, an autoregressive language model with 175 billion parameters, 10x more than any previous non-sparse language model, and test its performance in the few-shot setting.
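To make the few-shot setting concrete, the sketch below shows the in-context-learning prompt format the paper describes: a task description followed by a handful of demonstrations, with no gradient updates; the model's completion of the final line is taken as its answer. The translation examples mirror the illustrative ones from the paper.

```python
# Minimal sketch of a few-shot prompt as used in the GPT-3 evaluations:
# the task is specified entirely in the prompt via example pairs.
few_shot_prompt = """Translate English to French.

sea otter => loutre de mer
peppermint => menthe poivrée
plush giraffe => girafe en peluche
cheese =>"""

# The model is asked to continue this text; the tokens it produces after the
# final "=>" (ideally "fromage") are scored as its answer.
print(few_shot_prompt)
```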