Conversation

@Rtur2003 (Owner)

This pull request updates the Netlify function for analysis to act as a proxy that forwards multipart requests to an external inference service, instead of returning a mock response. It also adds new environment variables to support integration with external services such as Spotify, YouTube, and the inference API.

Netlify function enhancements:

  • Replaced the mock/preview response in netlify/functions/analyze.js with logic to forward multipart POST requests to an external inference service, returning the actual service response to the client. This includes error handling for missing configuration and backend unavailability.

Environment variable additions:

  • Added SPOTIFY_REDIRECT_URI, YOUTUBE_OAUTH_CLIENT_ID, YOUTUBE_OAUTH_CLIENT_SECRET, YOUTUBE_OAUTH_REDIRECT_URI, and INFERENCE_API_URL to platform/.env.example to support OAuth flows and inference service configuration.
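
For orientation, here is a rough sketch of the resulting handler, assembled from the snippets quoted in the review comments below. It is not the exact merged code: the missing-configuration status code and error key are placeholders, and the request-method and multipart content-type validation mentioned in the review is omitted.

// netlify/functions/analyze.js -- sketch of the proxy flow (placeholders noted above)
exports.handler = async (event) => {
  // Missing-configuration guard: INFERENCE_API_URL, with NEXT_PUBLIC_API_URL as fallback.
  const targetBase = process.env.INFERENCE_API_URL || process.env.NEXT_PUBLIC_API_URL
  if (!targetBase) {
    // Placeholder status code and error key; the merged function may differ here.
    return { statusCode: 500, body: JSON.stringify({ errors: ['inference_api_not_configured'] }) }
  }

  // Build the upstream URL and re-materialise the multipart body from the event payload.
  const target = targetBase.endsWith('/api/analyze')
    ? targetBase
    : `${targetBase.replace(/\/$/, '')}/api/analyze`
  const contentType = event.headers['content-type'] || event.headers['Content-Type']
  const bodyBuffer = Buffer.from(event.body || '', event.isBase64Encoded ? 'base64' : 'utf8')

  try {
    // Forward the multipart POST and relay the upstream response to the client.
    const response = await fetch(target, {
      method: 'POST',
      headers: { 'content-type': contentType },
      body: bodyBuffer
    })
    const text = await response.text()
    return {
      statusCode: response.status,
      body: text,
      headers: { 'Content-Type': response.headers.get('content-type') || 'application/json' }
    }
  } catch (error) {
    // Backend unavailability maps to a 502 with a generic message.
    return {
      statusCode: 502,
      body: JSON.stringify({ errors: ['backend_unreachable'], message: 'Failed to reach inference service' })
    }
  }
}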

Copilot AI review requested due to automatic review settings December 18, 2025 20:42

netlify bot commented Dec 18, 2025

Deploy Preview for crowncode-by-rthur failed.

🔨 Latest commit: 346dac0
🔍 Latest deploy log: https://app.netlify.com/projects/crowncode-by-rthur/deploys/69446726e28ab90008d83103

Rtur2003 merged commit 5559cb1 into geliştirme on Dec 18, 2025
8 of 13 checks passed

Copilot AI left a comment

Pull request overview

This pull request transforms the Netlify analyze function from returning mock responses to acting as a proxy that forwards multipart requests to an external inference service. It also adds OAuth-related environment variables to support future integration with Spotify and YouTube services.

Key changes:

  • Replaced the mock response logic in the Netlify function with a proxy implementation that forwards requests to an external inference API
  • Added environment variables for Spotify and YouTube OAuth flows, plus an INFERENCE_API_URL for the external service
  • Removed the unused fs import from the Netlify function

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 6 comments.

File | Description
netlify/functions/analyze.js | Converted from a mock response generator to a proxy that forwards multipart/form-data requests to the external inference service
platform/.env.example | Added OAuth redirect URIs for Spotify and YouTube, plus INFERENCE_API_URL configuration

Comment on lines +26 to +30
const response = await fetch(target, {
  method: 'POST',
  headers: { 'content-type': contentType },
  body: bodyBuffer
})

Copilot AI Dec 18, 2025

The fetch request has no timeout configured, which means it could hang indefinitely if the inference service is slow or unresponsive. This will block the Netlify function and potentially cause user-facing timeouts. Consider adding an AbortSignal with a timeout (e.g., 30 seconds for analysis operations) to ensure the request fails gracefully if the backend takes too long.
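
A minimal sketch of the suggested timeout, assuming the Netlify runtime exposes Node 18's global fetch and AbortSignal.timeout:

const response = await fetch(target, {
  method: 'POST',
  headers: { 'content-type': contentType },
  body: bodyBuffer,
  // Abort the upstream call if the inference service takes longer than 30 seconds.
  signal: AbortSignal.timeout(30_000)
})

When the signal fires, fetch rejects, so the existing catch block would turn the timeout into the same 502 response.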

SPOTIFY_REDIRECT_URI=http://localhost:3000/api/auth/spotify/callback
YOUTUBE_OAUTH_CLIENT_ID=your-youtube-oauth-client-id
YOUTUBE_OAUTH_CLIENT_SECRET=your-youtube-oauth-client-secret
YOUTUBE_OAUTH_REDIRECT_URI=http://localhost:3000/ai-music-detection

Copilot AI Dec 18, 2025

INFERENCE_API_URL appears to be a duplicate of NEXT_PUBLIC_API_URL based on the default value (both point to http://localhost:8000). The Netlify function falls back to NEXT_PUBLIC_API_URL if INFERENCE_API_URL is not set (line 9 of analyze.js). Consider either removing INFERENCE_API_URL if it's truly redundant, or documenting why two separate environment variables are needed (e.g., if they should point to different services in production).

Suggested change
- YOUTUBE_OAUTH_REDIRECT_URI=http://localhost:3000/ai-music-detection
+ YOUTUBE_OAUTH_REDIRECT_URI=http://localhost:3000/ai-music-detection
+ # INFERENCE_API_URL is kept separate from NEXT_PUBLIC_API_URL so that
+ # AI inference can be hosted on a different service/URL in production.
+ # In local development both services run on the same backend.

Comment on lines +14 to +16
const target = targetBase.endsWith('/api/analyze')
  ? targetBase
  : `${targetBase.replace(/\/$/, '')}/api/analyze`

Copilot AI Dec 18, 2025

The Netlify function forwards requests to '/api/analyze' on the inference service, but the current backend implementation in this repository only exposes '/api/youtube/analyze' (see backend/app/routes/youtube.py). Either the backend needs to be updated to add a unified '/api/analyze' endpoint that handles multiple source types (file, youtube, etc.), or the documentation should clarify that INFERENCE_API_URL should point to a different external service that implements this unified endpoint. Without this endpoint, the proxy will return 404 errors.

Suggested change
- const target = targetBase.endsWith('/api/analyze')
-   ? targetBase
-   : `${targetBase.replace(/\/$/, '')}/api/analyze`
+ const target = /\/api\/(analyze|youtube\/analyze)$/.test(targetBase)
+   ? targetBase
+   : `${targetBase.replace(/\/$/, '')}/api/youtube/analyze`

Comment on lines +26 to +38
const response = await fetch(target, {
  method: 'POST',
  headers: { 'content-type': contentType },
  body: bodyBuffer
})

const text = await response.text()
return {
  statusCode: response.status,
  body: text,
  headers: {
    'Content-Type': response.headers.get('content-type') || 'application/json'
  }

Copilot AI Dec 18, 2025

The proxy only forwards the Content-Type header to the inference service, but other important headers from the original request (such as Accept, User-Agent, or custom headers) are not being forwarded. While Content-Type is the most critical for multipart uploads, consider whether other headers should be preserved. Additionally, headers from the backend response (other than Content-Type) are not being forwarded back to the client, which could include important metadata like rate limit information or custom error details.

Suggested change
- const response = await fetch(target, {
-   method: 'POST',
-   headers: { 'content-type': contentType },
-   body: bodyBuffer
- })
- const text = await response.text()
- return {
-   statusCode: response.status,
-   body: text,
-   headers: {
-     'Content-Type': response.headers.get('content-type') || 'application/json'
-   }
+ // Forward most incoming headers to the backend, excluding hop-by-hop / problematic ones.
+ const incomingHeaders = event.headers || {}
+ const outgoingHeaders = {}
+ const headerBlacklist = new Set(['host', 'connection', 'content-length', 'transfer-encoding', 'accept-encoding'])
+ for (const [name, value] of Object.entries(incomingHeaders)) {
+   if (typeof value === 'undefined') continue
+   const lowerName = name.toLowerCase()
+   if (headerBlacklist.has(lowerName)) continue
+   // We'll override content-type below with the validated value.
+   if (lowerName === 'content-type') continue
+   outgoingHeaders[name] = value
+ }
+ // Ensure the validated multipart content-type is sent to the backend.
+ outgoingHeaders['content-type'] = contentType
+ const response = await fetch(target, {
+   method: 'POST',
+   headers: outgoingHeaders,
+   body: bodyBuffer
+ })
+ const text = await response.text()
+ // Forward most backend response headers back to the client.
+ const responseHeaders = {}
+ const responseHeaderBlacklist = new Set(['connection', 'content-length', 'transfer-encoding'])
+ response.headers.forEach((value, name) => {
+   const lowerName = name.toLowerCase()
+   if (responseHeaderBlacklist.has(lowerName)) return
+   responseHeaders[name] = value
+ })
+ if (!Object.keys(responseHeaders).some((h) => h.toLowerCase() === 'content-type')) {
+   responseHeaders['Content-Type'] = 'application/json'
+ }
+ return {
+   statusCode: response.status,
+   body: text,
+   headers: responseHeaders

Comment on lines +40 to +47
} catch (error) {
  return {
    statusCode: 502,
    body: JSON.stringify({
      errors: ['backend_unreachable'],
      message: 'Failed to reach inference service'
    })
  }

Copilot AI Dec 18, 2025

The catch block doesn't log the error details, making debugging difficult when the inference service fails. The error message returned to the client is generic ('Failed to reach inference service'), which doesn't help distinguish between network errors, DNS failures, connection refused, or other issues. Consider logging the error (e.g., console.error(error)) for debugging purposes while keeping the client-facing message generic for security.
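
A minimal sketch of the logged variant, keeping the client-facing message generic (console.error output ends up in the Netlify function logs):

} catch (error) {
  // Log the underlying cause (DNS failure, connection refused, timeout, ...) for debugging.
  console.error('Inference proxy request failed:', error)
  return {
    statusCode: 502,
    body: JSON.stringify({
      errors: ['backend_unreachable'],
      message: 'Failed to reach inference service'
    })
  }
}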

- return {
-   statusCode: 200,
-   body: JSON.stringify({ result, warnings: ['gateway_mock_preview'] })
+ const bodyBuffer = Buffer.from(event.body || '', event.isBase64Encoded ? 'base64' : 'utf8')

Copilot AI Dec 18, 2025

The Buffer encoding logic may corrupt binary data in multipart/form-data requests. When event.isBase64Encoded is false, the code treats the body as UTF-8 text, but multipart form data containing binary files (like audio files) should be treated as binary. This will corrupt the audio file data when forwarding to the inference service. Netlify automatically sets isBase64Encoded to true for binary content, but the utf8 fallback is problematic if that detection fails. Consider using 'binary' or 'latin1' encoding instead of 'utf8' for the non-base64 case, or always treat multipart bodies as binary.

Suggested change
- const bodyBuffer = Buffer.from(event.body || '', event.isBase64Encoded ? 'base64' : 'utf8')
+ const bodyBuffer = Buffer.from(
+   event.body || '',
+   event.isBase64Encoded ? 'base64' : 'binary'
+ )
