chore: Remove local news agent files as part of project cleanup

Author: Shubhamsaboo
Date: 2025-11-08 22:02:11 -08:00
Parent: 828995333d
Commit: ca76169e03
5 changed files with 0 additions and 181 deletions


@@ -87,7 +87,6 @@ A curated collection of **Awesome LLM apps built with RAG, AI Agents, Multi-agen
* [🎵 AI Music Generator Agent](starter_ai_agents/ai_music_generator_agent/)
* [🛫 AI Travel Agent (Local & Cloud)](starter_ai_agents/ai_travel_agent/)
* [✨ Gemini Multimodal Agent](starter_ai_agents/gemini_multimodal_agent_demo/)
* [🌐 Local News Agent (OpenAI Swarm)](starter_ai_agents/local_news_agent_openai_swarm/)
* [🔄 Mixture of Agents](starter_ai_agents/mixture_of_agents/)
* [📊 xAI Finance Agent](starter_ai_agents/xai_finance_agent/)
* [🔍 OpenAI Research Agent](starter_ai_agents/opeani_research_agent/)


@@ -1,2 +0,0 @@
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_API_KEY=fake-key


@@ -1,51 +0,0 @@
## 📰 Multi-agent AI news assistant
This Streamlit application implements a sophisticated news processing pipeline using multiple specialized AI agents to search, synthesize, and summarize news articles. It leverages the Llama 3.2 model via Ollama and DuckDuckGo search to provide comprehensive news analysis.
### Features
- Multi-agent architecture with specialized roles:
- News Searcher: Finds recent news articles
- News Synthesizer: Analyzes and combines information
- News Summarizer: Creates concise, professional summaries
- Real-time news search using DuckDuckGo
- AP/Reuters-style summary generation
- User-friendly Streamlit interface
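Stripped of the Swarm and Streamlit specifics, the three roles above form a sequential handoff in which each stage consumes the previous stage's output. A minimal sketch of that control flow (the stage functions here are stand-ins for illustration, not the app's actual LLM-backed agents):

```python
def run_pipeline(topic, search, synthesize, summarize):
    """Chain three stages, feeding each stage's output to the next."""
    raw = search(topic)
    combined = synthesize(raw)
    return raw, combined, summarize(combined)

# Stand-in stages; the real app backs each with a specialized agent.
raw, combined, summary = run_pipeline(
    "ai",
    search=lambda t: f"articles about {t}",
    synthesize=lambda r: f"synthesis of: {r}",
    summarize=lambda s: f"summary of: {s}",
)
print(summary)  # summary of: synthesis of: articles about ai
```

Because each stage only sees its predecessor's output, any stage can be swapped (e.g. a different search backend) without touching the others.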
### How to Get Started
1. Clone the GitHub repository:
```bash
git clone https://github.com/Shubhamsaboo/awesome-llm-apps.git
cd awesome-llm-apps/starter_ai_agents/local_news_agent_openai_swarm
```
2. Install the required dependencies:
```bash
pip install -r requirements.txt
```
3. Pull and Run Llama 3.2 using Ollama:
```bash
# Pull the model
ollama pull llama3.2
# Verify installation
ollama list
# Run the model (optional test)
ollama run llama3.2
```
4. Create a .env file with your configuration:
```bash
# Ollama's OpenAI-compatible endpoint; the key is required by the client but ignored by Ollama
OPENAI_BASE_URL=http://localhost:11434/v1
OPENAI_API_KEY=fake-key
```
5. Run the Streamlit app:
```bash
streamlit run news_agent.py
```
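Ollama exposes an OpenAI-compatible API, so pointing `OPENAI_BASE_URL` at `http://localhost:11434/v1` is the only redirection the Swarm client needs. A small sketch of how the two `.env` settings from step 4 are consumed (the endpoint path follows the standard OpenAI client convention):

```python
# Swarm's underlying OpenAI client reads these from the environment (via
# load_dotenv()) and POSTs chat requests to <base>/chat/completions.
base_url = "http://localhost:11434/v1"   # OPENAI_BASE_URL
api_key = "fake-key"                     # required by the client, ignored by Ollama

chat_endpoint = f"{base_url}/chat/completions"
print(chat_endpoint)  # http://localhost:11434/v1/chat/completions
```

No real OpenAI key is needed anywhere; all inference stays on the local Ollama server.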


@@ -1,124 +0,0 @@
import streamlit as st
from duckduckgo_search import DDGS
from swarm import Swarm, Agent
from datetime import datetime
from dotenv import load_dotenv
load_dotenv()
MODEL = "llama3.2:latest"
client = Swarm()
st.set_page_config(page_title="AI News Processor", page_icon="📰")
st.title("📰 News Inshorts Agent")
def search_news(topic):
    """Search for news articles using DuckDuckGo"""
    with DDGS() as ddg:
        results = ddg.text(f"{topic} news {datetime.now().strftime('%Y-%m')}", max_results=3)
        if results:
            news_results = "\n\n".join([
                f"Title: {result['title']}\nURL: {result['href']}\nSummary: {result['body']}"
                for result in results
            ])
            return news_results
        return f"No news found for {topic}."

# Create specialized agents
search_agent = Agent(
    name="News Searcher",
    instructions="""
    You are a news search specialist. Your task is to:
    1. Search for the most relevant and recent news on the given topic
    2. Ensure the results are from reputable sources
    3. Return the raw search results in a structured format
    """,
    functions=[search_news],
    model=MODEL
)

synthesis_agent = Agent(
    name="News Synthesizer",
    instructions="""
    You are a news synthesis expert. Your task is to:
    1. Analyze the raw news articles provided
    2. Identify the key themes and important information
    3. Combine information from multiple sources
    4. Create a comprehensive but concise synthesis
    5. Focus on facts and maintain journalistic objectivity
    6. Write in a clear, professional style
    Provide a 2-3 paragraph synthesis of the main points.
    """,
    model=MODEL
)

summary_agent = Agent(
    name="News Summarizer",
    instructions="""
    You are an expert news summarizer combining AP and Reuters style clarity with digital-age brevity.
    Your task:
    1. Core Information:
    - Lead with the most newsworthy development
    - Include key stakeholders and their actions
    - Add critical numbers/data if relevant
    - Explain why this matters now
    - Mention immediate implications
    2. Style Guidelines:
    - Use strong, active verbs
    - Be specific, not general
    - Maintain journalistic objectivity
    - Make every word count
    - Explain technical terms if necessary
    Format: Create a single paragraph of 250-400 words that informs and engages.
    Pattern: [Major News] + [Key Details/Data] + [Why It Matters/What's Next]
    Focus on answering: What happened? Why is it significant? What's the impact?
    IMPORTANT: Provide ONLY the summary paragraph. Do not include any introductory phrases,
    labels, or meta-text like "Here's a summary" or "In AP/Reuters style."
    Start directly with the news content.
    """,
    model=MODEL
)

def process_news(topic):
    """Run the news processing workflow"""
    with st.status("Processing news...", expanded=True) as status:
        # Search
        status.write("🔍 Searching for news...")
        search_response = client.run(
            agent=search_agent,
            messages=[{"role": "user", "content": f"Find recent news about {topic}"}]
        )
        raw_news = search_response.messages[-1]["content"]

        # Synthesize
        status.write("🔄 Synthesizing information...")
        synthesis_response = client.run(
            agent=synthesis_agent,
            messages=[{"role": "user", "content": f"Synthesize these news articles:\n{raw_news}"}]
        )
        synthesized_news = synthesis_response.messages[-1]["content"]

        # Summarize
        status.write("📝 Creating summary...")
        summary_response = client.run(
            agent=summary_agent,
            messages=[{"role": "user", "content": f"Summarize this synthesis:\n{synthesized_news}"}]
        )
        return raw_news, synthesized_news, summary_response.messages[-1]["content"]

# User Interface
topic = st.text_input("Enter news topic:", value="artificial intelligence")

if st.button("Process News", type="primary"):
    if topic:
        try:
            raw_news, synthesized_news, final_summary = process_news(topic)
            st.header(f"📝 News Summary: {topic}")
            st.markdown(final_summary)
        except Exception as e:
            st.error(f"An error occurred: {str(e)}")
    else:
        st.error("Please enter a topic!")


@@ -1,3 +0,0 @@
git+https://github.com/openai/swarm.git
streamlit
duckduckgo-search