
API for AI chat completion worked on local but 500 on Cloudflare. Edge runtime related? #1102

Open
AndrewThien opened this issue Mar 2, 2025 · 0 comments


Questions & Answers

Context of your question
Hi all,
I am having problems deploying my Next.js app, which uses an API endpoint for chat completions via the OpenAI SDK (pointed at OpenRouter or OpenAI). The API route below works fine locally, but when I deploy the app to Cloudflare Pages, every call returns Internal Server Error 500.

```ts
import { NextResponse } from "next/server";
import OpenAI from "openai";

// Specify Edge runtime
export const runtime = "edge";

export async function POST(req: Request) {
  try {
    // Read the messages sent by the client
    const { messages } = await req.json();

    const openai = new OpenAI({
      apiKey: process.env.OPENAI_API_KEY,
    });

    const completion = await openai.chat.completions.create({
      model: "gpt-4o-mini",
      messages,
    });

    return NextResponse.json({ content: completion.choices[0].message.content });
  } catch (error: any) {
    console.error("AI API error:", error.message);
    return NextResponse.json({ error: error.message }, { status: 500 });
  }
}
```

and the client call is just basic:

```ts
try {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ messages: transformedData }),
  });
  toast.dismiss();
  if (!res.ok) {
    const errorText = await res.text();
    toast.error(`AI failed to think..., ${errorText}`);
    throw new Error(errorText);
  }

  const responseText = await res.json();

  const clean = DOMPurify.sanitize(responseText.content, {
    USE_PROFILES: { html: true },
  });
  setResponse(clean);
  toast.success("AI has responded!");
} finally {
  setProcessing(false);
}
```

I suspect some Edge runtime compatibility problem between Cloudflare and Next.js, but I still can't figure it out.
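One thing worth ruling out first: on Cloudflare Pages, environment variables are not baked in at build time the way they often are locally; if `OPENAI_API_KEY` is not bound in the Pages dashboard, `process.env.OPENAI_API_KEY` is `undefined` at request time and the SDK call fails with an opaque 500. A minimal sketch of a fail-fast check (the `checkApiKey` helper is hypothetical, not part of any SDK) could look like this:

```typescript
// Hypothetical diagnostic helper: verify the API key is actually present
// in the deployed runtime before constructing the OpenAI client, so the
// route returns a descriptive error instead of an opaque 500.
export function checkApiKey(key: string | undefined): { ok: boolean; error?: string } {
  if (!key || key.trim() === "") {
    return { ok: false, error: "OPENAI_API_KEY is not set in this runtime" };
  }
  return { ok: true };
}

// Usage inside the route handler (sketch):
//   const check = checkApiKey(process.env.OPENAI_API_KEY);
//   if (!check.ok) {
//     return NextResponse.json({ error: check.error }, { status: 500 });
//   }
```

If the deployed route returns this message, the problem is the missing binding rather than the Edge runtime itself.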

I have tried deploying a bare-bones app containing only the UI and this API route to Vercel, but the Internal Server Error 500 is still there.

I also tried routing the API call's base URL through Cloudflare's AI Gateway (https://developers.cloudflare.com/ai-gateway/providers/openrouter/), but that did not help either.
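For reference, when going through the AI Gateway, the base URL follows the path shape described in the linked Cloudflare docs. A tiny sketch of building that URL (the `gatewayBaseURL` helper and the placeholder account/gateway names are my own, not from any SDK):

```typescript
// Hypothetical helper: build the OpenRouter-via-AI-Gateway base URL.
// accountId and gateway are placeholders you substitute from your own
// Cloudflare dashboard; the path shape follows the AI Gateway docs.
export function gatewayBaseURL(accountId: string, gateway: string): string {
  return `https://gateway.ai.cloudflare.com/v1/${accountId}/${gateway}/openrouter`;
}

// Usage (sketch):
//   const openai = new OpenAI({
//     apiKey: process.env.OPENROUTER_API_KEY,
//     baseURL: gatewayBaseURL("<account-id>", "<gateway-name>"),
//   });
```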

Could someone please have a look? Thanks a lot!
