
Foot-admin-chatbot

AI Football Field Rental Assistant

A sophisticated AI-powered chatbot designed to manage a football field rental business. It uses an agent architecture built with LangChain and LangGraph to differentiate between customers and administrators, routing queries to the correct bot mode.

The user interface is a Telegram bot, making it easily accessible for both managing the business and making reservations. The entire solution is designed to be deployed on Google Cloud Platform for scalability and efficiency.


Architecture

The system is built on a decoupled, microservices-oriented architecture, which ensures scalability, maintainability, and separation of concerns.

  • AI Agent Service (Cloud Run): This is the core "brain" of the operation. It's a FastAPI application that exposes a single API endpoint. It contains the complex LangGraph logic for classifying user roles and generating responses using the agent pipeline. We use Cloud Run for this service as it can handle potentially long-running or memory-intensive AI tasks.

  • Telegram Bot Service (Cloud Function): This is the lightweight front-end handler. It's a serverless Cloud Function with an HTTP trigger that acts as a webhook for the Telegram Bot API. Its sole responsibility is to receive messages, forward them to the AI Agent Service, and relay the response back to the user. This is extremely cost-effective as it only runs when a message is received.
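The interplay between the two services can be sketched as a webhook handler like the one below. This is a hedged illustration, not the project's actual code: the real entry point lives in telegram_bot/main.py, and the /invoke payload shape and the "response" field name are assumptions.

```python
# Sketch of the Cloud Function webhook handler (hypothetical payload shape).
import os
import requests

AI_AGENT_URL = os.environ.get("AI_AGENT_URL", "http://localhost:8080/invoke")
TELEGRAM_API_KEY = os.environ.get("TELEGRAM_API_KEY", "")

def foot_admin_bot(request):
    """HTTP-triggered entry point: Telegram update in, reply out."""
    update = request.get_json(silent=True) or {}
    message = update.get("message", {})
    chat_id = message.get("chat", {}).get("id")
    text = message.get("text")
    if not (chat_id and text):
        return "ok"  # ignore non-text updates

    # Forward the message to the AI Agent Service on Cloud Run.
    resp = requests.post(
        AI_AGENT_URL,
        json={
            "user_id": str(chat_id),
            "username": message.get("from", {}).get("username", ""),
            "message": text,
        },
        timeout=120,
    )
    reply = resp.json().get("response", "")

    # Relay the agent's answer back to the user via the Bot API.
    requests.post(
        f"https://api.telegram.org/bot{TELEGRAM_API_KEY}/sendMessage",
        json={"chat_id": chat_id, "text": reply},
        timeout=30,
    )
    return "ok"
```

Because the function only runs per incoming update and holds no state, it stays cheap and scales to zero between messages.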


Tools and Database Interaction

The agent's functionality is powered by a set of specialized Python functions (@tools) that interact with a Firestore database (or similar backend).

  • get_user_info (User): Retrieves a user's contact information based on their unique ID (e.g., Telegram ID). Always the first step in the User Flow, used to identify returning customers.
  • save_user_info (User): Saves or updates a user's personal details (name, lastname, cellphone, email). Called after confirming details with a new user, or after a returning user confirms any updates.
  • check_availability (User): Verifies whether a specific date, hour, and field type is free, taking into account the field dependency rules (F7 and F11 cannot be booked simultaneously). Mandatory before calling save_field_reservation.
  • save_field_reservation (User): Finalizes and records the booking details in the calendar collection. The final step of a successful booking, called only after availability is confirmed.
  • get_admin_reservations (Admin): Fetches all reservation details, including user contact info, within a specified date range. The sole tool of the Admin Flow, used to quickly view the daily or range-specific schedule.
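The F7/F11 dependency rule enforced by check_availability can be illustrated with pure logic. This is a sketch only: the real tool queries Firestore, while the in-memory calendar list and its field names here are assumptions.

```python
# F7 and F11 share physical space, so booking either one blocks the other.
CONFLICTS = {"F5": {"F5"}, "F7": {"F7", "F11"}, "F11": {"F11", "F7"}}

def check_availability(calendar, date, hour, field_type):
    """Return True if the requested slot is free, honoring field dependencies."""
    blocked = CONFLICTS[field_type]
    return not any(
        r["date"] == date and r["hour"] == hour and r["field_type"] in blocked
        for r in calendar
    )
```

For example, with an F7 booking at 20:00, an F11 request for the same slot is rejected while an F5 request goes through.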

Agent Flows

The agent employs a state-based graph to strictly enforce the conversational logic and route commands based on the user's role (is_admin: bool). The agent is programmed to always respond in Spanish.

1. User Reservation Flow

This flow is designed to be a guided, multi-step process that ensures all necessary data is collected, verified, and confirmed before a booking is finalized.

  • Start (Agent Node): User sends the initial message.
  • Step 1, User Check (get_user_info): The agent's first action is always to check whether the user is registered, using their unique user_id.
  • Step 2, Info Collection/Update (save_user_info): New user: collects and validates name, lastname, cellphone, and email, then saves them. Existing user: confirms their details before proceeding.
  • Step 3, Reservation Details (LLM dialogue): Gathers date, hour, field_type (F5, F7, or F11), needs_shirt, and payment_method.
  • Step 4, Availability Check (check_availability): Uses the collected details to verify the slot. If not available, the agent prompts the user to choose new details and repeats this step.
  • Step 5, Final Booking (save_field_reservation): Presents a summary to the user. Only upon final confirmation does the agent call the tool to save the booking.
  • End (Agent Node): Confirms the booking is complete and provides a closing message.

2. Admin Scheduling Flow

This flow is straightforward, designed for the administrator to quickly access and view the schedule using a single, powerful tool.

  • Start (Agent Node): Admin sends a request (e.g., "Show me today's bookings").
  • Step 1, Date Request (LLM dialogue): The agent greets the admin and asks for the specific date or date range (start_date and optional end_date) they wish to view.
  • Step 2, Retrieve Schedule (get_admin_reservations): The agent calls the tool with the requested date(s). The tool handles linking reservation records to user contact details.
  • Step 3, Display Schedule (Agent Node): The agent formats the tool output into a clean, easy-to-read schedule grouped by date, field type, and time.
  • End (Agent Node): Asks the admin if they need to check any other dates.

Agent Architecture (LangGraph)

The agent uses a LangGraph state machine to manage the conversation, ensuring the correct tools are used at the right time:

  • AgentState: Manages the conversation history (messages) and the key context variables user_id, username, and is_admin.
  • Role-Based Tool Binding: The core logic dynamically binds the LLM to either the user_tools or the admin_tools based on the initial is_admin flag, ensuring role security.
  • Routing Logic:
    • The agent_node processes the input and generates a response or a tool call.
    • The should_continue conditional edge checks for a tool call.
    • If a tool call is present, it routes to call_tool.
    • The get_tool_node function dynamically selects and executes the correct tool (user_tool_node or admin_tool_node) based on the role.
    • The result of the tool execution is fed back to the agent_node for the final response.
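The routing described above can be sketched as plain functions. In the real project these are wired into a LangGraph StateGraph and the messages are LLM message objects; the simplified dict-based state shown here is an assumption for illustration.

```python
# Simplified stand-ins for the graph's routing functions.
def should_continue(state):
    """Conditional edge: route to tools if the last LLM message requested one."""
    last = state["messages"][-1]
    return "call_tool" if last.get("tool_calls") else "end"

def get_tool_node(state):
    """Pick the tool node that matches the caller's role."""
    return "admin_tool_node" if state["is_admin"] else "user_tool_node"
```

Keeping the role check inside a single routing function means an admin-only tool can never be reached from a customer conversation, even if the LLM hallucinates a call to it.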

Tech Stack

  • Backend Logic: Python 3.12
  • AI Frameworks: LangChain, LangGraph
  • LLM Provider: Google Vertex AI (Gemini models)
  • API Framework: FastAPI
  • Telegram Bot: python-telegram-bot library
  • Cloud Platform: Google Cloud Platform (GCP)
  • Deployment: Docker, Cloud Run, Cloud Functions (Gen2)

Project Structure

/Football-field-ai-assistant/
├── .gitignore
├── ai_agent/
│   ├── main.py             # FastAPI app with LangGraph agent logic
│   ├── Dockerfile          # Container definition for Cloud Run
│   ├── requirements.txt    # Python dependencies for the agent
│   ├── information.txt     # Knowledge base for user queries
│   └── .env                # Environment variables
│
└── telegram_bot/
    ├── main.py             # Cloud Function code for the Telegram bot
    ├── requirements.txt    # Python dependencies for the bot
    └── .env.yaml           # Environment variables for GCP deployment

Setup and Deployment

Prerequisites

  • Python 3.12.
  • Google Cloud SDK (gcloud CLI) installed and authenticated.
  • Docker installed and running.
  • A Telegram Bot Token obtained from @BotFather.

Step 1: Clone the Repository

git clone https://github.com/conrabeatriz/foot-admin-chatbot.git
cd foot-admin-chatbot

Step 2: Deploy the AI Agent to Cloud Run

This service will act as our backend brain.

  1. Navigate to the ai_agent directory:

    cd ai_agent
  2. Create a file named .env and add your environment variables:

    # ai_agent/.env
    ADMIN_USERNAMES="YOUR_ADMIN_USERNAMES"

    Note: Replace YOUR_ADMIN_USERNAMES with the Telegram usernames of your admins to ensure the correct mode.

  3. Run the deployment command. Replace YOUR_GCP_PROJECT_ID and YOUR_REGION with your actual GCP project ID and preferred region (e.g., us-central1).

    gcloud run deploy ai-agent-service \
      --source . \
      --project=YOUR_GCP_PROJECT_ID \
      --region=YOUR_REGION \
      --allow-unauthenticated
  4. After the deployment is successful, GCP will provide a Service URL. Copy this URL for the next step.

Step 3: Deploy the Telegram Bot to Cloud Functions

This function will be the public-facing webhook for Telegram.

  1. Navigate to the telegram_bot directory:

    cd ../telegram_bot
  2. Create a file named .env.yaml and add your environment variables:

    # telegram_bot/.env.yaml
    TELEGRAM_API_KEY: "YOUR_TELEGRAM_BOT_TOKEN"
    AI_AGENT_URL: "PASTE_YOUR_CLOUD_RUN_SERVICE_URL_HERE/invoke"

    Note: Replace YOUR_TELEGRAM_BOT_TOKEN with the token from BotFather and PASTE_YOUR_CLOUD_RUN_SERVICE_URL_HERE with the URL from the previous step. Make sure to add /invoke at the end.

  3. Run the deployment command:

    gcloud functions deploy foot_admin_bot \
      --gen2 \
      --runtime=python312 \
      --project=YOUR_GCP_PROJECT_ID \
      --region=YOUR_REGION \
      --source=. \
      --entry-point=foot_admin_bot \
      --trigger-http \
      --allow-unauthenticated \
      --env-vars-file .env.yaml
  4. After deployment, GCP will provide a Function URL (or "Trigger URL"). Copy it.

Step 4: Set the Telegram Webhook

The final step is to tell Telegram where to send messages. Open the following URL in your web browser, replacing the placeholders with your token and Function URL:

https://api.telegram.org/bot{YOUR_TELEGRAM_TOKEN}/setWebhook?url={YOUR_FUNCTION_URL}

You should see a JSON response like {"ok":true,"result":true,"description":"Webhook was set"}.
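If you prefer to script this step instead of using the browser, the same call can be made from Python. A minimal stdlib-only sketch (the helper names are ours, not part of the project):

```python
import json
import urllib.request

def build_setwebhook_url(token: str, function_url: str) -> str:
    """Assemble the Bot API setWebhook URL."""
    return f"https://api.telegram.org/bot{token}/setWebhook?url={function_url}"

def set_webhook(token: str, function_url: str) -> dict:
    """Call setWebhook and return Telegram's JSON response."""
    with urllib.request.urlopen(build_setwebhook_url(token, function_url), timeout=30) as r:
        return json.load(r)
```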


Usage

Your bot is now live! Open Telegram, find your bot, and start sending it messages.

Customer Examples:

  • "How much does it cost to rent a field for 5 players at night?"
  • "What are your opening hours?"
  • "Do you have locker rooms?"

Admin Examples:

  • "Show me all pending reservations for next week."
  • "Who has a reservation tomorrow at 8 PM?"

AI Assistance & Collaboration Disclaimer

This project, including portions of its code structure, logic, and debugging efforts, was developed with the active assistance of Gemini.

Gemini was primarily utilized for the following tasks:

  • Code Generation: Generating initial drafts or boilerplate code for specific functions or components.
  • Debugging: Identifying and resolving runtime errors, logical flaws, and optimizing code snippets.
  • Conceptualization: Exploring and clarifying complex implementation strategies.

The use of this tool accelerated development and provided valuable assistance in problem-solving. All final integration, testing, and architectural decisions were performed and validated by human developers.

Please note: While every effort has been made to ensure the code is correct and efficient, the output of the language model was treated as a powerful suggestion, not a final solution, and was subject to rigorous human review.
