feat: Added AI-powered ChatBot #71
Conversation
@Chaitu7032 is attempting to deploy a commit to the mihir2006's projects Team on Vercel. A member of the Team first needs to authorize it.
Thanks for creating this PR! We'll review it as soon as possible. If there are any unresolved review comments, feel free to resolve them.
Summary of Changes

Hello @Chaitu7032, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed.

This pull request significantly enhances the StockVision platform by introducing a comprehensive AI-powered chatbot. The chatbot is designed to provide users with intelligent stock analysis, real-time market insights, and interactive tools for portfolio management. The changes involve the implementation of new backend API endpoints and AI services, along with the integration of a floating chat interface and a dedicated AI chat tab in the frontend dashboard. Additionally, the pull request includes updates to security configurations and the database schema to support the new features.
@MIHIR2006 please review my PR ...
Code Review
This pull request introduces a new AI-powered chatbot feature, including backend FastAPI endpoints, AI services, frontend components, and database schema updates. The implementation is a good start, but there are several critical areas for improvement. My review focuses on addressing performance bottlenecks in asynchronous code, enhancing security by preventing error leakage, improving the robustness of AI interactions, and increasing maintainability by removing hardcoded values and promoting better code practices like dependency injection.
```python
}

url = f"https://www.alphavantage.co/query?function=GLOBAL_QUOTE&symbol={symbol}&apikey={api_key}"
response = requests.get(url)
```
The requests.get() call is a synchronous, blocking I/O operation. Using it inside an async function will block the server's event loop, severely degrading performance and preventing other requests from being processed concurrently. You must use an asynchronous HTTP client like httpx (which is already in your requirements.txt) to make this request non-blocking.
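For reference, a minimal sketch of a non-blocking version of this fetch using httpx; the function name and the `ALPHA_VANTAGE_API_KEY` environment variable come from the diff, while the timeout value and error handling are assumptions:

```python
import os
import httpx

async def get_real_time_stock_data(symbol: str) -> dict:
    """Non-blocking quote fetch (sketch only; retries/error handling trimmed)."""
    api_key = os.getenv("ALPHA_VANTAGE_API_KEY")  # assumes the key is read from the env, as elsewhere in the PR
    url = f"https://www.alphavantage.co/query?function=GLOBAL_QUOTE&symbol={symbol}&apikey={api_key}"
    async with httpx.AsyncClient(timeout=10.0) as client:
        response = await client.get(url)
        response.raise_for_status()
        return response.json()
```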
```python
self.alpha_vantage_key = os.getenv("ALPHA_VANTAGE_API_KEY")

if self.openai_api_key:
    openai.api_key = self.openai_api_key
```
Setting openai.api_key as a global variable is not thread-safe and can lead to race conditions in a concurrent environment like a web server. It's much safer to pass the API key directly to each API call (e.g., openai.ChatCompletion.create(api_key=self.openai_api_key, ...)). This also aligns with the best practices for the newer openai>=1.0.0 library versions.
Suggested change:

```diff
- openai.api_key = self.openai_api_key
+ # openai.api_key = self.openai_api_key
```
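As an illustration, a minimal sketch of passing the key per call with the pinned openai==0.28.1 client; the model name and message shape are placeholders, not taken from the PR:

```python
import openai

def ask_model(prompt: str, api_key: str) -> str:
    # Pass the key explicitly instead of mutating the global openai.api_key.
    response = openai.ChatCompletion.create(
        api_key=api_key,
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
    )
    return response["choices"][0]["message"]["content"]
```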
```python
        end = content.rfind('}') + 1
        json_str = content[start:end]
        return json.loads(json_str)
    except:
```
Using a bare except: clause is dangerous because it catches all exceptions, including system-exiting ones like SystemExit or KeyboardInterrupt, which can hide critical bugs and make the application unstable. You should at least catch the specific json.JSONDecodeError you expect here, or Exception at a minimum.
Suggested change:

```diff
- except:
+ except json.JSONDecodeError:
```
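For context, a sketch of how the whole extraction helper could look with the narrower exception; the helper name, the `content.find('{')` line, and the `None` fallback are reconstructed around the lines shown and are not the PR's exact code:

```python
import json
from typing import Any, Optional

def extract_json(content: str) -> Optional[dict]:
    """Pull the first {...} block out of a model response and parse it."""
    try:
        start = content.find('{')
        end = content.rfind('}') + 1
        json_str = content[start:end]
        return json.loads(json_str)
    except json.JSONDecodeError:
        # Malformed JSON from the model: signal failure instead of swallowing everything.
        return None
```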
```python
timestamp: datetime

# In-memory storage for demo (replace with database later)
chat_sessions = {}
```
Using an in-memory dictionary for chat_sessions is not suitable for a production environment. It will lose all data on server restart, won't scale beyond a single process, and can lead to high memory consumption. The new Prisma schema correctly defines models for persisting this data, and this implementation should be updated to use the database.
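As an illustration only, a sketch of what database-backed persistence could look like with prisma-client-py; the `chatmessage` model, its fields, and the function names are hypothetical stand-ins for whatever the new Prisma schema actually defines:

```python
from prisma import Prisma

db = Prisma()  # assumes `await db.connect()` is called once at application startup

async def save_message(session_id: str, role: str, content: str) -> None:
    # Hypothetical model/field names; adjust to the real schema.
    await db.chatmessage.create(
        data={
            "sessionId": session_id,
            "role": role,
            "content": content,
        }
    )

async def load_history(session_id: str) -> list:
    return await db.chatmessage.find_many(
        where={"sessionId": session_id},
        order={"createdAt": "asc"},
    )
```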
| """Main chatbot endpoint""" | ||
| try: | ||
| # Initialize AI analyzer | ||
| ai_analyzer = AIStockAnalyzer() |
A new instance of AIStockAnalyzer is created for every chat request. This is inefficient as it can involve re-reading environment variables and other setup costs. You can improve performance by creating a single instance and reusing it across requests. A good way to do this in FastAPI is with a dependency, like so:
```python
# At module level
# You can use @lru_cache for a simple singleton
from functools import lru_cache

@lru_cache()
def get_ai_analyzer():
    return AIStockAnalyzer()

@router.post("/chat", response_model=ChatResponse)
async def chat_with_ai(request: ChatRequest, ai_analyzer: AIStockAnalyzer = Depends(get_ai_analyzer)):
    # ... use ai_analyzer directly without creating a new instance ...
```

```
aiofiles>=23.2.0
# AI Chatbot dependencies
openai==0.28.1
requests==2.31.0
```
```tsx
</TabsContent>

{/* Settings Tab */}
{/* Settings Tab */}
```
```python
async def fetch_stock_data(analysis: Dict[str, Any]) -> Dict[str, Any]:
    """Fetch relevant stock data based on AI analysis"""
    data = {}

    try:
        if analysis["action"] == "get_price":
            for symbol in analysis.get("symbols", [])[:5]:  # Limit to 5 symbols
                stock_data = await get_real_time_stock_data(symbol)
                data[symbol] = stock_data

        elif analysis["action"] == "get_trends":
            for symbol in analysis.get("symbols", [])[:3]:  # Limit to 3 symbols for trends
                trends = await get_stock_trends(symbol, analysis.get("time_range", 30))
                data[f"{symbol}_trends"] = trends

        elif analysis["action"] == "market_summary":
            # Get summary of major indices
            major_stocks = ["AAPL", "GOOGL", "MSFT"]
            for symbol in major_stocks:
                stock_data = await get_real_time_stock_data(symbol)
                data[symbol] = stock_data

        # If no symbols found, provide general market info
        if not data and analysis["action"] in ["get_price", "get_trends"]:
            data["info"] = "No specific stocks mentioned. Try asking about stocks like AAPL, GOOGL, MSFT, or TSLA."

    except Exception as e:
        data["error"] = f"Error fetching stock data: {str(e)}"

    return data
```
| "timestamp": datetime.now() | ||
| }, | ||
| { | ||
| "role": "assistant", | ||
| "content": response, | ||
| "timestamp": datetime.now(), | ||
| "data": data | ||
| } | ||
| ]) | ||
|
|
||
| return ChatResponse( | ||
| response=response, | ||
| data=data, | ||
| session_id=request.session_id, | ||
| timestamp=datetime.now() |
You are calling datetime.now() multiple times within this block (lines 49, 54, 63). This can result in slightly different timestamps for the user message, assistant message, and the final response. It's better to capture the timestamp once at the beginning of the request and reuse the same datetime object for consistency.
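A minimal sketch of the suggested fix, assuming the surrounding endpoint looks roughly like the diff above; `request.message` and the `chat_sessions[...]` access are assumptions based on the rest of the PR:

```python
from datetime import datetime

# Capture one timestamp per request and reuse it everywhere.
now = datetime.now()

chat_sessions[request.session_id].extend([
    {"role": "user", "content": request.message, "timestamp": now},
    {"role": "assistant", "content": response, "timestamp": now, "data": data},
])

return ChatResponse(
    response=response,
    data=data,
    session_id=request.session_id,
    timestamp=now,
)
```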
```tsx
'use client';

import { useState, useRef, useEffect } from 'react';
import { Card, CardContent, CardHeader, CardTitle } from '../app/components/ui/card';
import { Button } from '../app/components/ui/button';
import { Input } from '../app/components/ui/input';
import { ScrollArea } from '../app/components/ui/scroll-area';
import { MessageSquare, Send, Bot, User, Loader2 } from 'lucide-react';
import { cn } from '../app/lib/utils';

interface ChatMessage {
  id: string;
  role: 'user' | 'assistant';
  content: string;
  timestamp: Date;
  data?: any;
}

interface StockChatbotProps {
  className?: string;
}

export function StockChatbot({ className }: StockChatbotProps) {
  const [messages, setMessages] = useState<ChatMessage[]>([
    {
      id: '1',
      role: 'assistant',
      content: 'Hello! I\'m your AI Stock Analyzer. I can help you with:\n\n• Real-time stock prices and trends\n• Portfolio analysis and comparisons\n• Risk assessment and growth analysis\n• Market insights and recommendations\n\nWhat would you like to analyze today? Try asking:\n- "What\'s the current price of AAPL?"\n- "Show me trends for TSLA"\n- "Give me a market summary"',
      timestamp: new Date()
    }
  ]);
  const [input, setInput] = useState('');
  const [isLoading, setIsLoading] = useState(false);
  const [sessionId] = useState(() => `session_${Date.now()}`);
  const scrollRef = useRef<HTMLDivElement>(null);

  const scrollToBottom = () => {
    if (scrollRef.current) {
      scrollRef.current.scrollTop = scrollRef.current.scrollHeight;
    }
  };

  useEffect(() => {
    scrollToBottom();
  }, [messages]);

  const sendMessage = async () => {
    if (!input.trim() || isLoading) return;

    const userMessage: ChatMessage = {
      id: Date.now().toString(),
      role: 'user',
      content: input,
      timestamp: new Date()
    };

    setMessages(prev => [...prev, userMessage]);
    const currentInput = input;
    setInput('');
    setIsLoading(true);

    try {
      const response = await fetch('/api/chatbot/chat', {
        method: 'POST',
        headers: {
          'Content-Type': 'application/json',
        },
        body: JSON.stringify({
          message: currentInput,
          session_id: sessionId
        }),
      });

      if (!response.ok) {
        throw new Error(`HTTP error! status: ${response.status}`);
      }

      const data = await response.json();

      const assistantMessage: ChatMessage = {
        id: (Date.now() + 1).toString(),
        role: 'assistant',
        content: data.response,
        timestamp: new Date(),
        data: data.data
      };

      setMessages(prev => [...prev, assistantMessage]);
    } catch (error) {
      console.error('Chat error:', error);
      const errorMessage: ChatMessage = {
        id: (Date.now() + 1).toString(),
        role: 'assistant',
        content: 'Sorry, I encountered an error processing your request. Please ensure the backend is running and try again.',
        timestamp: new Date()
      };
      setMessages(prev => [...prev, errorMessage]);
    } finally {
      setIsLoading(false);
    }
  };

  const handleKeyPress = (e: React.KeyboardEvent) => {
    if (e.key === 'Enter' && !e.shiftKey) {
      e.preventDefault();
      sendMessage();
    }
  };

  const quickQuestions = [
    "What's the price of AAPL?",
    "Show me TSLA trends",
    "Market summary today",
    "Compare GOOGL vs MSFT"
  ];

  const handleQuickQuestion = (question: string) => {
    setInput(question);
  };

  return (
    <Card className={cn("h-[600px] flex flex-col", className)}>
      <CardHeader className="pb-3">
        <CardTitle className="flex items-center gap-2">
          <Bot className="h-5 w-5 text-primary" />
          AI Stock Analyzer
        </CardTitle>
      </CardHeader>

      <CardContent className="flex-1 flex flex-col p-0">
        <ScrollArea ref={scrollRef} className="flex-1 px-4">
          <div className="space-y-4 py-4">
            {messages.map((message) => (
              <div
                key={message.id}
                className={cn(
                  "flex gap-3",
                  message.role === 'user' ? "justify-end" : "justify-start"
                )}
              >
                <div
                  className={cn(
                    "flex gap-2 max-w-[80%]",
                    message.role === 'user' ? "flex-row-reverse" : "flex-row"
                  )}
                >
                  <div className={cn(
                    "w-8 h-8 rounded-full flex items-center justify-center text-xs font-medium shrink-0",
                    message.role === 'user'
                      ? "bg-primary text-primary-foreground"
                      : "bg-muted text-muted-foreground"
                  )}>
                    {message.role === 'user' ? <User className="h-4 w-4" /> : <Bot className="h-4 w-4" />}
                  </div>

                  <div className={cn(
                    "rounded-lg px-3 py-2 text-sm whitespace-pre-wrap",
                    message.role === 'user'
                      ? "bg-primary text-primary-foreground"
                      : "bg-muted"
                  )}>
                    {message.content}

                    {/* Render data visualizations if available */}
                    {message.data && Object.keys(message.data).length > 0 && (
                      <div className="mt-3 p-3 bg-background/20 rounded border">
                        <StockDataVisualization data={message.data} />
                      </div>
                    )}
                  </div>
                </div>
              </div>
            ))}

            {isLoading && (
              <div className="flex gap-3">
                <div className="w-8 h-8 rounded-full bg-muted flex items-center justify-center">
                  <Loader2 className="h-4 w-4 animate-spin" />
                </div>
                <div className="bg-muted rounded-lg px-3 py-2 text-sm">
                  Analyzing your request...
                </div>
              </div>
            )}
          </div>
        </ScrollArea>

        {/* Quick Questions */}
        <div className="px-4 py-2 border-t bg-muted/20">
          <div className="text-xs text-muted-foreground mb-2">Quick questions:</div>
          <div className="flex flex-wrap gap-1">
            {quickQuestions.map((question, index) => (
              <Button
                key={index}
                variant="outline"
                size="sm"
                className="h-6 text-xs px-2"
                onClick={() => handleQuickQuestion(question)}
                disabled={isLoading}
              >
                {question}
              </Button>
            ))}
          </div>
        </div>

        <div className="p-4 border-t">
          <div className="flex gap-2">
            <Input
              value={input}
              onChange={(e) => setInput(e.target.value)}
              onKeyPress={handleKeyPress}
              placeholder="Ask about stocks, portfolios, or market trends..."
              disabled={isLoading}
              className="flex-1"
            />
            <Button
              onClick={sendMessage}
              disabled={!input.trim() || isLoading}
              size="icon"
            >
              {isLoading ? (
                <Loader2 className="h-4 w-4 animate-spin" />
              ) : (
                <Send className="h-4 w-4" />
              )}
            </Button>
          </div>
        </div>
      </CardContent>
    </Card>
  );
}

// Component to visualize stock data in chat
function StockDataVisualization({ data }: { data: any }) {
  if (!data || typeof data !== 'object') return null;

  return (
    <div className="space-y-3">
      <div className="text-xs font-semibold text-muted-foreground uppercase tracking-wide">
        Stock Data
      </div>

      {Object.entries(data).map(([key, value]: [string, any]) => (
        <div key={key} className="space-y-1">
          {typeof value === 'object' && value !== null ? (
            <div className="space-y-1">
              <div className="text-xs font-medium">{key.toUpperCase()}:</div>
              <div className="pl-2 space-y-1">
                {Object.entries(value).map(([subKey, subValue]: [string, any]) => (
                  <div key={subKey} className="flex justify-between text-xs">
                    <span className="text-muted-foreground capitalize">
                      {subKey.replace('_', ' ')}:
                    </span>
                    <span className="font-mono">
                      {typeof subValue === 'number' ? (
                        subKey.includes('price') || subKey.includes('cost') ?
                          `$${subValue.toFixed(2)}` :
                        subKey.includes('percent') ?
                          `${subValue.toFixed(2)}%` :
                          subValue.toLocaleString()
                      ) : (
                        String(subValue)
                      )}
                    </span>
                  </div>
                ))}
              </div>
            </div>
          ) : (
            <div className="flex justify-between text-xs">
              <span className="text-muted-foreground capitalize">
                {key.replace('_', ' ')}:
              </span>
              <span className="font-mono">{String(value)}</span>
            </div>
          )}
        </div>
      ))}
    </div>
  );
}
```
MIHIR2006 left a comment:
Gemini flagged some high-priority issues in this PR. Before we can merge, please take a look at these findings and push a commit to address them.
Ok @MIHIR2006, I will get back to you soon with the changes. Thanks for the update.
@Chaitu7032 I noticed an issue: the chatbot appears on the landing page, where it replaces the button. Instead, the chatbot icon should only be visible on the /dashboard route so that only signed-in users can access it. This also helps keep API usage within its limits.
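A minimal sketch of one way to gate the floating chat to the dashboard route, assuming a Next.js App Router setup; `FloatingChatButton` and the wrapper component name are hypothetical, not names from the PR:

```tsx
'use client';

import { usePathname } from 'next/navigation';
import { FloatingChatButton } from './FloatingChatButton'; // hypothetical component name

export function DashboardOnlyChat() {
  const pathname = usePathname();

  // Only render the chat launcher on /dashboard (and its sub-routes),
  // so visitors on the landing page never trigger chatbot API calls.
  if (!pathname?.startsWith('/dashboard')) return null;

  return <FloatingChatButton />;
}
```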
Yeah @MIHIR2006, I will work on it and update you. Actually, it is already present only in the dashboard; I kept it on the landing page just for the frontend look. OK, I will remove it there and keep it only inside the dashboard.
@Chaitu7032 yeah, even small tweaks like these make a big difference in performance and help cut down costs. And honestly, don’t worry too much if your PR doesn’t get merged right away; the process itself teaches you a lot. Every time you try, you’re building skills and confidence, and that’s what makes it easier to contribute to bigger repos over time.
I understand, @MIHIR2006. I tried my level best to integrate my idea into your project; whether it is accepted depends on the quality of my work. I will come back to you with the changes mentioned. Thanks for the guidance and support.
Overview


This PR introduces a comprehensive AI-powered chatbot system that enhances the StockVision platform with intelligent stock analysis, real-time market insights, and interactive portfolio management capabilities.