
Conversation

@Chaitu7032
Contributor

Overview
This PR introduces a comprehensive AI-powered chatbot system that enhances the StockVision platform with intelligent stock analysis, real-time market insights, and interactive portfolio management capabilities.
Screenshot 2025-09-23 180528
Screenshot 2025-09-23 180558

@vercel

vercel bot commented Sep 23, 2025

@Chaitu7032 is attempting to deploy a commit to the mihir2006's projects Team on Vercel.

A member of the Team first needs to authorize it.

@github-actions

Thanks for creating this PR!

We'll review it as soon as possible.
In the meantime, please double-check the file changes and ensure that all commits are accurate.

If there are any unresolved review comments, feel free to resolve them.

@gemini-code-assist
Contributor

Summary of Changes

Hello @Chaitu7032, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the StockVision platform by introducing a comprehensive AI-powered chatbot. The chatbot is designed to provide users with intelligent stock analysis, real-time market insights, and interactive tools for portfolio management. The changes involve the implementation of new backend API endpoints and AI services, along with the integration of a floating chat interface and a dedicated AI chat tab in the frontend dashboard. Additionally, the pull request includes updates to security configurations and the database schema to support the new features.

Highlights

  • AI Chatbot Integration: A new AI-powered chatbot has been integrated into the StockVision platform, offering intelligent stock analysis, real-time market insights, and interactive portfolio management capabilities.
  • Backend API and Services: New FastAPI routes and services (chatbot.py, ai_services.py) have been added to handle chat interactions, analyze user queries using AI, and fetch relevant stock data. This includes a fallback mechanism if OpenAI API keys are not configured.
  • Frontend User Interface: The frontend now features a floating chatbot component (floating-chatbot.tsx) accessible globally and a dedicated 'AI Chat' tab within the dashboard (dashboard.tsx, ai-chatbot.tsx) for a more comprehensive chat experience, including quick questions and data visualization.
  • Security Enhancements: The .gitignore file has been updated to include more patterns for sensitive data (e.g., .env files, database files, API keys, session files), and a SECURITY.md file has been added to document data protection status and best practices.
  • Database Schema Updates: The Prisma schema (schema.prisma) has been extended with new models for ChatSession, ChatMessage, MessageRole, Portfolio, and PortfolioStock to support persistent chat history and future portfolio management features.
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point by creating a comment using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands on the current page.

Feature | Command | Description

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.

You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check it and use code with caution.

@Chaitu7032
Contributor Author

@MIHIR2006 please review my PR.

Contributor

@gemini-code-assist bot left a comment


Code Review

This pull request introduces a new AI-powered chatbot feature, including backend FastAPI endpoints, AI services, frontend components, and database schema updates. The implementation is a good start, but there are several critical areas for improvement. My review focuses on addressing performance bottlenecks in asynchronous code, enhancing security by preventing error leakage, improving the robustness of AI interactions, and increasing maintainability by removing hardcoded values and promoting better code practices like dependency injection.

    }

    url = f"https://www.alphavantage.co/query?function=GLOBAL_QUOTE&symbol={symbol}&apikey={api_key}"
    response = requests.get(url)

critical

The requests.get() call is a synchronous, blocking I/O operation. Using it inside an async function will block the server's event loop, severely degrading performance and preventing other requests from being processed concurrently. You must use an asynchronous HTTP client like httpx (which is already in your requirements.txt) to make this request non-blocking.
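To make this concrete, here is a minimal sketch of the non-blocking version using `httpx`. The `build_quote_url` helper and the timeout constant are illustrative names, not code from the PR; only the URL format and `get_real_time_stock_data` come from the review context.

```python
import asyncio

API_TIMEOUT_SECONDS = 10  # hypothetical timeout; pick a value that suits the API


def build_quote_url(symbol: str, api_key: str) -> str:
    """Build the Alpha Vantage GLOBAL_QUOTE URL (same format as the original code)."""
    return (
        "https://www.alphavantage.co/query"
        f"?function=GLOBAL_QUOTE&symbol={symbol}&apikey={api_key}"
    )


async def get_real_time_stock_data(symbol: str, api_key: str) -> dict:
    """Non-blocking fetch: httpx.AsyncClient yields to the event loop during I/O."""
    import httpx  # third-party; already in the project's requirements.txt per the review

    async with httpx.AsyncClient(timeout=API_TIMEOUT_SECONDS) as client:
        response = await client.get(build_quote_url(symbol, api_key))
        response.raise_for_status()
        return response.json()
```

The key difference from `requests.get()` is the `await`: while the HTTP response is in flight, FastAPI can service other requests on the same event loop.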

self.alpha_vantage_key = os.getenv("ALPHA_VANTAGE_API_KEY")

if self.openai_api_key:
    openai.api_key = self.openai_api_key

high

Setting openai.api_key as a global variable is not thread-safe and can lead to race conditions in a concurrent environment like a web server. It's much safer to pass the API key directly to each API call (e.g., openai.ChatCompletion.create(api_key=self.openai_api_key, ...)). This also aligns with the best practices for the newer openai>=1.0.0 library versions.

Suggested change
openai.api_key = self.openai_api_key
# openai.api_key = self.openai_api_key
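A hedged sketch of what the per-instance approach might look like with the client-based `openai>=1.0.0` API. The class shape and `has_ai` property are illustrative, not from the PR; only `openai_api_key` mirrors the code under review.

```python
from typing import Optional


class AIStockAnalyzer:
    """Holds its own key instead of mutating the global openai.api_key."""

    def __init__(self, api_key: Optional[str]):
        self.openai_api_key = api_key
        self._client = None  # lazily created OpenAI client

    @property
    def has_ai(self) -> bool:
        """Drives the fallback path when no key is configured."""
        return bool(self.openai_api_key)

    def _get_client(self):
        # Deferred import so the module loads even without the package installed.
        from openai import OpenAI  # openai>=1.0.0 client API

        if self._client is None:
            self._client = OpenAI(api_key=self.openai_api_key)
        return self._client
```

Because each instance owns its client, no global state is shared between concurrent requests.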

    end = content.rfind('}') + 1
    json_str = content[start:end]
    return json.loads(json_str)
except:

high

Using a bare except: clause is dangerous because it catches all exceptions, including system-exiting ones like SystemExit or KeyboardInterrupt, which can hide critical bugs and make the application unstable. You should at least catch the specific json.JSONDecodeError you expect here, or Exception at a minimum.

Suggested change
except:
except json.JSONDecodeError:
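Put together, the extraction helper with the narrow exception could look like the sketch below (the function name `extract_json_block` is hypothetical; the find/rfind logic mirrors the snippet above):

```python
import json
from typing import Any, Optional


def extract_json_block(content: str) -> Optional[Any]:
    """Pull the first {...} span out of a model reply; None if absent or invalid."""
    start = content.find('{')
    end = content.rfind('}') + 1
    if start == -1 or end <= start:
        return None
    try:
        return json.loads(content[start:end])
    except json.JSONDecodeError:  # narrow: only the parse failure we expect
        return None
```

Returning `None` instead of swallowing everything lets the caller decide on a fallback, and `KeyboardInterrupt`/`SystemExit` still propagate normally.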

    timestamp: datetime

# In-memory storage for demo (replace with database later)
chat_sessions = {}

high

Using an in-memory dictionary for chat_sessions is not suitable for a production environment. It will lose all data on server restart, won't scale beyond a single process, and can lead to high memory consumption. The new Prisma schema correctly defines models for persisting this data, and this implementation should be updated to use the database.
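One way to stage that migration is to hide the storage behind a narrow interface so the route code never touches the dict directly; a database-backed implementation (using the new Prisma models) can then replace the demo backend without changing the routes. This is a sketch under those assumptions; none of these class names appear in the PR.

```python
from abc import ABC, abstractmethod
from typing import Dict, List


class SessionStore(ABC):
    """Narrow interface so routes never depend on a specific storage backend."""

    @abstractmethod
    def append(self, session_id: str, message: dict) -> None: ...

    @abstractmethod
    def history(self, session_id: str) -> List[dict]: ...


class InMemorySessionStore(SessionStore):
    """Demo-only backend: data vanishes on restart and is single-process."""

    def __init__(self) -> None:
        self._sessions: Dict[str, List[dict]] = {}

    def append(self, session_id: str, message: dict) -> None:
        self._sessions.setdefault(session_id, []).append(message)

    def history(self, session_id: str) -> List[dict]:
        return list(self._sessions.get(session_id, []))
```

A `DatabaseSessionStore` implementing the same two methods could later be injected via FastAPI's `Depends`, leaving the endpoints unchanged.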

"""Main chatbot endpoint"""
try:
# Initialize AI analyzer
ai_analyzer = AIStockAnalyzer()

high

A new instance of AIStockAnalyzer is created for every chat request. This is inefficient as it can involve re-reading environment variables and other setup costs. You can improve performance by creating a single instance and reusing it across requests. A good way to do this in FastAPI is with a dependency, like so:

# At module level
# You can use @lru_cache for a simple singleton
from functools import lru_cache

@lru_cache()
def get_ai_analyzer():
    return AIStockAnalyzer()

@router.post("/chat", response_model=ChatResponse)
async def chat_with_ai(request: ChatRequest, ai_analyzer: AIStockAnalyzer = Depends(get_ai_analyzer)):
    # ... use ai_analyzer directly without creating a new instance ...

aiofiles>=23.2.0
# AI Chatbot dependencies
openai==0.28.1
requests==2.31.0

medium

The requests dependency is duplicated. It is already listed on line 23. You can remove this redundant line.

</TabsContent>

{/* Settings Tab */}
{/* Settings Tab */}

medium

This comment {/* Settings Tab */} is a duplicate of the one on the line above. It can be removed to improve code clarity.

Comment on lines +84 to +113
async def fetch_stock_data(analysis: Dict[str, Any]) -> Dict[str, Any]:
    """Fetch relevant stock data based on AI analysis"""
    data = {}

    try:
        if analysis["action"] == "get_price":
            for symbol in analysis.get("symbols", [])[:5]:  # Limit to 5 symbols
                stock_data = await get_real_time_stock_data(symbol)
                data[symbol] = stock_data

        elif analysis["action"] == "get_trends":
            for symbol in analysis.get("symbols", [])[:3]:  # Limit to 3 symbols for trends
                trends = await get_stock_trends(symbol, analysis.get("time_range", 30))
                data[f"{symbol}_trends"] = trends

        elif analysis["action"] == "market_summary":
            # Get summary of major indices
            major_stocks = ["AAPL", "GOOGL", "MSFT"]
            for symbol in major_stocks:
                stock_data = await get_real_time_stock_data(symbol)
                data[symbol] = stock_data

        # If no symbols found, provide general market info
        if not data and analysis["action"] in ["get_price", "get_trends"]:
            data["info"] = "No specific stocks mentioned. Try asking about stocks like AAPL, GOOGL, MSFT, or TSLA."

    except Exception as e:
        data["error"] = f"Error fetching stock data: {str(e)}"

    return data

medium

This function contains several hardcoded values, such as symbol limits ([:5], [:3]) and the list of major_stocks. These should be defined as constants at the module level or moved to a configuration file. This will improve maintainability and make them easier to find and change in the future.
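For example, the caps and the index list could be lifted to module-level constants (the names below are illustrative, not from the PR):

```python
# Module-level constants replacing the inline [:5] / [:3] slices and list literal.
MAX_PRICE_SYMBOLS = 5                     # cap for per-symbol price lookups
MAX_TREND_SYMBOLS = 3                     # trend queries are heavier, so a lower cap
MAJOR_STOCKS = ("AAPL", "GOOGL", "MSFT")  # used for market summaries


def limit_symbols(analysis: dict) -> list:
    """Apply the per-action symbol cap in one place."""
    cap = MAX_TREND_SYMBOLS if analysis.get("action") == "get_trends" else MAX_PRICE_SYMBOLS
    return analysis.get("symbols", [])[:cap]
```

With the limits named, tuning them (or moving them to a config file or environment variables) becomes a one-line change.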

Comment on lines +49 to +63
"timestamp": datetime.now()
},
{
"role": "assistant",
"content": response,
"timestamp": datetime.now(),
"data": data
}
])

return ChatResponse(
response=response,
data=data,
session_id=request.session_id,
timestamp=datetime.now()

medium

You are calling datetime.now() multiple times within this block (lines 49, 54, 63). This can result in slightly different timestamps for the user message, assistant message, and the final response. It's better to capture the timestamp once at the beginning of the request and reuse the same datetime object for consistency.
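A small sketch of the capture-once pattern (the helper name `build_exchange` is hypothetical; the message dict shapes mirror the snippet above):

```python
from datetime import datetime
from typing import List


def build_exchange(user_text: str, assistant_text: str, data: dict) -> List[dict]:
    """Both messages share one timestamp, captured once at the top."""
    now = datetime.now()
    return [
        {"role": "user", "content": user_text, "timestamp": now},
        {"role": "assistant", "content": assistant_text, "timestamp": now, "data": data},
    ]
```

The same `now` object would also be passed to `ChatResponse`, so all three timestamps agree exactly.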

Comment on lines +1 to +283
'use client';

import { useState, useRef, useEffect } from 'react';
import { Card, CardContent, CardHeader, CardTitle } from '../app/components/ui/card';
import { Button } from '../app/components/ui/button';
import { Input } from '../app/components/ui/input';
import { ScrollArea } from '../app/components/ui/scroll-area';
import { MessageSquare, Send, Bot, User, Loader2 } from 'lucide-react';
import { cn } from '../app/lib/utils';

interface ChatMessage {
  id: string;
  role: 'user' | 'assistant';
  content: string;
  timestamp: Date;
  data?: any;
}

interface StockChatbotProps {
  className?: string;
}

export function StockChatbot({ className }: StockChatbotProps) {
  const [messages, setMessages] = useState<ChatMessage[]>([
    {
      id: '1',
      role: 'assistant',
      content: 'Hello! I\'m your AI Stock Analyzer. I can help you with:\n\n• Real-time stock prices and trends\n• Portfolio analysis and comparisons\n• Risk assessment and growth analysis\n• Market insights and recommendations\n\nWhat would you like to analyze today? Try asking:\n- "What\'s the current price of AAPL?"\n- "Show me trends for TSLA"\n- "Give me a market summary"',
      timestamp: new Date()
    }
  ]);
  const [input, setInput] = useState('');
  const [isLoading, setIsLoading] = useState(false);
  const [sessionId] = useState(() => `session_${Date.now()}`);
  const scrollRef = useRef<HTMLDivElement>(null);

  const scrollToBottom = () => {
    if (scrollRef.current) {
      scrollRef.current.scrollTop = scrollRef.current.scrollHeight;
    }
  };

  useEffect(() => {
    scrollToBottom();
  }, [messages]);

  const sendMessage = async () => {
    if (!input.trim() || isLoading) return;

    const userMessage: ChatMessage = {
      id: Date.now().toString(),
      role: 'user',
      content: input,
      timestamp: new Date()
    };

    setMessages(prev => [...prev, userMessage]);
    const currentInput = input;
    setInput('');
    setIsLoading(true);

    try {
      const response = await fetch('/api/chatbot/chat', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          message: currentInput,
          session_id: sessionId
        }),
      });

      if (!response.ok) {
        throw new Error(`HTTP error! status: ${response.status}`);
      }

      const data = await response.json();

      const assistantMessage: ChatMessage = {
        id: (Date.now() + 1).toString(),
        role: 'assistant',
        content: data.response,
        timestamp: new Date(),
        data: data.data
      };

      setMessages(prev => [...prev, assistantMessage]);
    } catch (error) {
      console.error('Chat error:', error);
      const errorMessage: ChatMessage = {
        id: (Date.now() + 1).toString(),
        role: 'assistant',
        content: 'Sorry, I encountered an error processing your request. Please ensure the backend is running and try again.',
        timestamp: new Date()
      };
      setMessages(prev => [...prev, errorMessage]);
    } finally {
      setIsLoading(false);
    }
  };

  const handleKeyPress = (e: React.KeyboardEvent) => {
    if (e.key === 'Enter' && !e.shiftKey) {
      e.preventDefault();
      sendMessage();
    }
  };

  const quickQuestions = [
    "What's the price of AAPL?",
    "Show me TSLA trends",
    "Market summary today",
    "Compare GOOGL vs MSFT"
  ];

  const handleQuickQuestion = (question: string) => {
    setInput(question);
  };

  return (
    <Card className={cn("h-[600px] flex flex-col", className)}>
      <CardHeader className="pb-3">
        <CardTitle className="flex items-center gap-2">
          <Bot className="h-5 w-5 text-primary" />
          AI Stock Analyzer
        </CardTitle>
      </CardHeader>

      <CardContent className="flex-1 flex flex-col p-0">
        <ScrollArea ref={scrollRef} className="flex-1 px-4">
          <div className="space-y-4 py-4">
            {messages.map((message) => (
              <div
                key={message.id}
                className={cn(
                  "flex gap-3",
                  message.role === 'user' ? "justify-end" : "justify-start"
                )}
              >
                <div
                  className={cn(
                    "flex gap-2 max-w-[80%]",
                    message.role === 'user' ? "flex-row-reverse" : "flex-row"
                  )}
                >
                  <div className={cn(
                    "w-8 h-8 rounded-full flex items-center justify-center text-xs font-medium shrink-0",
                    message.role === 'user'
                      ? "bg-primary text-primary-foreground"
                      : "bg-muted text-muted-foreground"
                  )}>
                    {message.role === 'user' ? <User className="h-4 w-4" /> : <Bot className="h-4 w-4" />}
                  </div>

                  <div className={cn(
                    "rounded-lg px-3 py-2 text-sm whitespace-pre-wrap",
                    message.role === 'user'
                      ? "bg-primary text-primary-foreground"
                      : "bg-muted"
                  )}>
                    {message.content}

                    {/* Render data visualizations if available */}
                    {message.data && Object.keys(message.data).length > 0 && (
                      <div className="mt-3 p-3 bg-background/20 rounded border">
                        <StockDataVisualization data={message.data} />
                      </div>
                    )}
                  </div>
                </div>
              </div>
            ))}

            {isLoading && (
              <div className="flex gap-3">
                <div className="w-8 h-8 rounded-full bg-muted flex items-center justify-center">
                  <Loader2 className="h-4 w-4 animate-spin" />
                </div>
                <div className="bg-muted rounded-lg px-3 py-2 text-sm">
                  Analyzing your request...
                </div>
              </div>
            )}
          </div>
        </ScrollArea>

        {/* Quick Questions */}
        <div className="px-4 py-2 border-t bg-muted/20">
          <div className="text-xs text-muted-foreground mb-2">Quick questions:</div>
          <div className="flex flex-wrap gap-1">
            {quickQuestions.map((question, index) => (
              <Button
                key={index}
                variant="outline"
                size="sm"
                className="h-6 text-xs px-2"
                onClick={() => handleQuickQuestion(question)}
                disabled={isLoading}
              >
                {question}
              </Button>
            ))}
          </div>
        </div>

        <div className="p-4 border-t">
          <div className="flex gap-2">
            <Input
              value={input}
              onChange={(e) => setInput(e.target.value)}
              onKeyPress={handleKeyPress}
              placeholder="Ask about stocks, portfolios, or market trends..."
              disabled={isLoading}
              className="flex-1"
            />
            <Button
              onClick={sendMessage}
              disabled={!input.trim() || isLoading}
              size="icon"
            >
              {isLoading ? (
                <Loader2 className="h-4 w-4 animate-spin" />
              ) : (
                <Send className="h-4 w-4" />
              )}
            </Button>
          </div>
        </div>
      </CardContent>
    </Card>
  );
}

// Component to visualize stock data in chat
function StockDataVisualization({ data }: { data: any }) {
  if (!data || typeof data !== 'object') return null;

  return (
    <div className="space-y-3">
      <div className="text-xs font-semibold text-muted-foreground uppercase tracking-wide">
        Stock Data
      </div>

      {Object.entries(data).map(([key, value]: [string, any]) => (
        <div key={key} className="space-y-1">
          {typeof value === 'object' && value !== null ? (
            <div className="space-y-1">
              <div className="text-xs font-medium">{key.toUpperCase()}:</div>
              <div className="pl-2 space-y-1">
                {Object.entries(value).map(([subKey, subValue]: [string, any]) => (
                  <div key={subKey} className="flex justify-between text-xs">
                    <span className="text-muted-foreground capitalize">
                      {subKey.replace('_', ' ')}:
                    </span>
                    <span className="font-mono">
                      {typeof subValue === 'number' ? (
                        subKey.includes('price') || subKey.includes('cost') ?
                          `$${subValue.toFixed(2)}` :
                          subKey.includes('percent') ?
                            `${subValue.toFixed(2)}%` :
                            subValue.toLocaleString()
                      ) : (
                        String(subValue)
                      )}
                    </span>
                  </div>
                ))}
              </div>
            </div>
          ) : (
            <div className="flex justify-between text-xs">
              <span className="text-muted-foreground capitalize">
                {key.replace('_', ' ')}:
              </span>
              <span className="font-mono">{String(value)}</span>
            </div>
          )}
        </div>
      ))}
    </div>
  );
}

medium

This file appears to be new but is not used anywhere in the application according to the changes in this pull request. If it's not needed, it should be removed to avoid cluttering the codebase with dead code.

Owner

@MIHIR2006 left a comment


Gemini flagged some high-priority issues in this PR. Before we can merge, please take a look at these findings and push a commit to address them.

@vercel

vercel bot commented Sep 23, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project | Deployment | Preview | Comments | Updated (UTC)
stock-vision | Ready | Preview | Comment | Sep 23, 2025 4:56pm

@Chaitu7032
Contributor Author

OK @MIHIR2006, I will get this to you soon with the changes. Thanks for the update.

@MIHIR2006
Owner

@Chaitu7032 I noticed an issue: the chatbot appears on the landing page, where it replaces the button. Instead, the chatbot icon should only be visible on the /dashboard route so that only signed-in users can access it. This also helps us stay within API usage limits.

@Chaitu7032
Contributor Author

Yeah @MIHIR2006, I will work on it and update you. Actually, it is already present only in the dashboard itself; I kept it on the landing page for the frontend look. OK, I will remove it and keep it only inside the dashboard.

@MIHIR2006
Owner

@Chaitu7032 yeah, even small tweaks like these make a big difference in performance and help cut down costs. And honestly, don't worry too much if your PR doesn't get merged right away; the process itself teaches you a lot. Every time you try, you're building skills and confidence, and that's what makes it easier to contribute to bigger repos over time.

@Chaitu7032
Contributor Author

I understand, @MIHIR2006. I tried my best to integrate my idea into your project; whether it is accepted depends on the extent of my work. I will come back to you with the changes mentioned. Thanks for the guidance and support.

@Chaitu7032 closed this Sep 24, 2025