First LlamaIndex Program

Below is a simple LlamaIndex example that shows how to read a local file, build an index over it, and query that index.

First, prepare the data file:
test/test.txt

Overview
NOTE: This README is not updated as frequently as the documentation. Please check out the documentation above for the latest updates!

Context
LLMs are a phenomenal piece of technology for knowledge generation and reasoning. They are pre-trained on large amounts of publicly available data.
How do we best augment LLMs with our own private data?
We need a comprehensive toolkit to help perform this data augmentation for LLMs.

Proposed Solution
That's where LlamaIndex comes in. LlamaIndex is a "data framework" to help you build LLM apps. It provides the following tools:

Offers data connectors to ingest your existing data sources and data formats (APIs, PDFs, docs, SQL, etc.).
Provides ways to structure your data (indices, graphs) so that this data can be easily used with LLMs.
Provides an advanced retrieval/query interface over your data: Feed in any LLM input prompt, get back retrieved context and knowledge-augmented output.
Allows easy integrations with your outer application framework (e.g. with LangChain, Flask, Docker, ChatGPT, anything else).
LlamaIndex provides tools for both beginner users and advanced users. Our high-level API allows beginner users to use LlamaIndex to ingest and query their data in 5 lines of code. Our lower-level APIs allow advanced users to customize and extend any module (data connectors, indices, retrievers, query engines, reranking modules), to fit their needs.
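If you prefer to create this file from code instead of by hand, a minimal sketch looks like the following; it assumes you run it from the project root and paste the overview text shown above into sample_text:

# Create the test directory and write the sample text into test/test.txt
from pathlib import Path

sample_text = "..."  # paste the LlamaIndex overview text shown above here

Path("test").mkdir(exist_ok=True)
Path("test/test.txt").write_text(sample_text, encoding="utf-8")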

Program source code:

# Import the required libraries
import os
from dotenv import load_dotenv
load_dotenv(override=True)

# Import the LlamaIndex core components
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

# Set the OpenAI API key (this overrides any value loaded from .env)
os.environ["OPENAI_API_KEY"] = 'YOUR_OPENAI_API_KEY'


# Load the documents from the "test" directory
documents = SimpleDirectoryReader("test").load_data()

# Build a vector store index from the documents
index = VectorStoreIndex.from_documents(documents)

# Turn the index into a query engine
query_engine = index.as_query_engine()

# Ask a question and get the response
response = query_engine.query("what is llamaindex?")

# Print the response
print(response)

Output:

LlamaIndex is a "data framework" designed to assist in building LLM apps. It offers tools such as data connectors for various data sources, ways to structure data for easy use with LLMs, an advanced retrieval/query interface, and integrations with different application frameworks. It caters to both beginner and advanced users, providing a high-level API for easy data ingestion and querying, as well as lower-level APIs for customization and extension of modules to suit specific needs.
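Note that this example rebuilds the index, and therefore re-calls the embedding API, on every run. As a rough follow-up sketch, you can persist the index to disk and reload it later; the "storage" directory name here is just an example:

# Persist the index to disk so it does not need to be rebuilt on every run
index.storage_context.persist(persist_dir="storage")

# Later (e.g. in another run), reload the persisted index instead of re-embedding
from llama_index.core import StorageContext, load_index_from_storage

storage_context = StorageContext.from_defaults(persist_dir="storage")
index = load_index_from_storage(storage_context)
query_engine = index.as_query_engine()
print(query_engine.query("what is llamaindex?"))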
