
Conversation

@ankush-jain-akto (Contributor)

No description provided.

Copilot AI review requested due to automatic review settings (October 25, 2025 17:24)
Copilot AI left a comment

Pull Request Overview

This PR introduces a service-to-service (S2S) graph feature for Kubernetes daemonset deployments, enabling tracking and visualization of service communication patterns through graph nodes and edges.

Key Changes:

  • Added SvcToSvcGraphManager to process and manage service communication graph data
  • Integrated graph parameter extraction from HTTP traffic via SampleParser
  • Created DAO layer and database operations for persisting graph nodes and edges

Reviewed Changes

Copilot reviewed 21 out of 21 changed files in this pull request and generated 5 comments.

Summary per file:

  • SvcToSvcGraphManager.java: Core manager implementing graph construction logic from traffic data
  • SampleParser.java: Extracts K8s graph parameters from HTTP request headers when enabled
  • HttpResponseParams.java: Added field to carry graph parameters through the processing pipeline
  • DataActor.java / DbActor.java / ClientActor.java: Implemented abstract methods for graph data persistence
  • DbLayer.java: Database query and update operations for graph edges and nodes
  • APICatalogSync.java (both apps): Integrated graph manager into traffic processing workflow
  • SvcToSvcGraph*.java (DTOs): Data models for graph representation (a rough sketch follows this list)
  • SvcToSvcGraphEdgesDao.java / SvcToSvcGraphNodesDao.java: MongoDB DAO implementations
  • DbAction.java / struts.xml: REST API endpoints for graph data operations
  • pom.xml files: Added Lombok dependency
  • TestSvcToSvcGraph.java: Unit test for graph manager functionality
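
Purely for orientation, here is a rough sketch of what the edge DTO implied by this PR could look like. Only the target field and the "creationEpoch" key (shown here with the corrected constant name suggested later in this review) are visible in the diff excerpts below; every other name is an assumption, not the PR's actual code:

    // Illustrative sketch only: apart from "target" and "creationEpoch", which appear
    // in the review excerpts, all fields and names here are assumptions.
    public class SvcToSvcGraphEdge {
        public static final String CREATION_EPOCH = "creationEpoch";

        private String source;      // assumed: identifier of the calling service
        private String target;      // visible in the diff: identifier of the called service
        private int creationEpoch;  // epoch seconds when this edge was first observed

        // getters/setters omitted; the PR adds Lombok, which may generate them
    }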


Comment on lines +1163 to +1172
                Filters.gte(SvcToSvcGraphEdge.CREATTION_EPOCH, startTs),
                Filters.lte(SvcToSvcGraphEdge.CREATTION_EPOCH, endTs)
        ), skip, limit, Sorts.ascending(SvcToSvcGraphEdge.CREATTION_EPOCH));
    }

    public static List<SvcToSvcGraphNode> findSvcToSvcGraphNodes(int startTs, int endTs, int skip, int limit) {
        return SvcToSvcGraphNodesDao.instance.findAll(Filters.and(
                Filters.gte(SvcToSvcGraphEdge.CREATTION_EPOCH, startTs),
                Filters.lte(SvcToSvcGraphEdge.CREATTION_EPOCH, endTs)
        ), skip, limit, Sorts.ascending(SvcToSvcGraphEdge.CREATTION_EPOCH));
Copilot AI Oct 25, 2025

Corrected spelling of 'CREATTION_EPOCH' to 'CREATION_EPOCH'.

Suggested change:
-                Filters.gte(SvcToSvcGraphEdge.CREATTION_EPOCH, startTs),
-                Filters.lte(SvcToSvcGraphEdge.CREATTION_EPOCH, endTs)
-        ), skip, limit, Sorts.ascending(SvcToSvcGraphEdge.CREATTION_EPOCH));
+                Filters.gte(SvcToSvcGraphEdge.CREATION_EPOCH, startTs),
+                Filters.lte(SvcToSvcGraphEdge.CREATION_EPOCH, endTs)
+        ), skip, limit, Sorts.ascending(SvcToSvcGraphEdge.CREATION_EPOCH));
     }
     public static List<SvcToSvcGraphNode> findSvcToSvcGraphNodes(int startTs, int endTs, int skip, int limit) {
         return SvcToSvcGraphNodesDao.instance.findAll(Filters.and(
-                Filters.gte(SvcToSvcGraphEdge.CREATTION_EPOCH, startTs),
-                Filters.lte(SvcToSvcGraphEdge.CREATTION_EPOCH, endTs)
-        ), skip, limit, Sorts.ascending(SvcToSvcGraphEdge.CREATTION_EPOCH));
+                Filters.gte(SvcToSvcGraphEdge.CREATION_EPOCH, startTs),
+                Filters.lte(SvcToSvcGraphEdge.CREATION_EPOCH, endTs)
+        ), skip, limit, Sorts.ascending(SvcToSvcGraphEdge.CREATION_EPOCH));

Comment on lines +1170 to +1172
                Filters.gte(SvcToSvcGraphEdge.CREATTION_EPOCH, startTs),
                Filters.lte(SvcToSvcGraphEdge.CREATTION_EPOCH, endTs)
        ), skip, limit, Sorts.ascending(SvcToSvcGraphEdge.CREATTION_EPOCH));
Copilot AI Oct 25, 2025

The method findSvcToSvcGraphNodes incorrectly references SvcToSvcGraphEdge.CREATTION_EPOCH instead of SvcToSvcGraphNode.CREATTION_EPOCH (or the corrected constant name). This will cause a compilation error or incorrect filtering as it's querying nodes but using edge field constants.

Suggested change:
-                Filters.gte(SvcToSvcGraphEdge.CREATTION_EPOCH, startTs),
-                Filters.lte(SvcToSvcGraphEdge.CREATTION_EPOCH, endTs)
-        ), skip, limit, Sorts.ascending(SvcToSvcGraphEdge.CREATTION_EPOCH));
+                Filters.gte(SvcToSvcGraphNode.CREATTION_EPOCH, startTs),
+                Filters.lte(SvcToSvcGraphNode.CREATTION_EPOCH, endTs)
+        ), skip, limit, Sorts.ascending(SvcToSvcGraphNode.CREATTION_EPOCH));

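Taken together, the two comments above imply that the node query should filter and sort on a correctly spelled constant defined on SvcToSvcGraphNode rather than on the edge class. A minimal sketch of the combined fix, assuming SvcToSvcGraphNode declares its own CREATION_EPOCH constant after the rename (that name follows the reviewer's suggestion and is not yet in the PR):

    // Combined fix (sketch): the node query uses the node class's renamed constant
    // instead of SvcToSvcGraphEdge's misspelled one.
    public static List<SvcToSvcGraphNode> findSvcToSvcGraphNodes(int startTs, int endTs, int skip, int limit) {
        return SvcToSvcGraphNodesDao.instance.findAll(Filters.and(
                Filters.gte(SvcToSvcGraphNode.CREATION_EPOCH, startTs),
                Filters.lte(SvcToSvcGraphNode.CREATION_EPOCH, endTs)
        ), skip, limit, Sorts.ascending(SvcToSvcGraphNode.CREATION_EPOCH));
    }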

private String target;

public static final String CREATTION_EPOCH = "creationEpoch";
Copilot AI Oct 25, 2025

Corrected spelling of 'CREATTION_EPOCH' to 'CREATION_EPOCH'.

Suggested change:
-public static final String CREATTION_EPOCH = "creationEpoch";
+public static final String CREATION_EPOCH = "creationEpoch";

Comment on lines +190 to +199
for (String node: newNodes) {
    boolean isNewNode = addNode(node);

    if (!isNewNode) {
        continue;
    }

    newNodes.remove(node);

}
Copilot AI Oct 25, 2025

Modifying newNodes set while iterating over it will cause a ConcurrentModificationException. Use an iterator with remove() or collect nodes to remove in a separate list.

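A minimal sketch of the iterator-based alternative, keeping the excerpt's apparent intent of dropping nodes that addNode reports as new (it assumes java.util.Iterator is imported and that addNode keeps the signature shown above):

    // Remove elements through the Iterator itself; calling newNodes.remove(node) inside
    // an enhanced for-loop over newNodes typically fails fast with ConcurrentModificationException.
    Iterator<String> it = newNodes.iterator();
    while (it.hasNext()) {
        String node = it.next();
        boolean isNewNode = addNode(node);
        if (isNewNode) {
            it.remove();
        }
    }
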
import com.mongodb.BasicDBList;
import com.mongodb.BasicDBObject;

import org.apache.kafka.common.protocol.types.Field.Str;
Copilot AI Oct 25, 2025

Unused import statement org.apache.kafka.common.protocol.types.Field.Str should be removed.

Suggested change:
-import org.apache.kafka.common.protocol.types.Field.Str;

@devsecopsbot (bot) left a comment

🤖 AI Security analysis: "Multiple package vulnerabilities in jackson-databind were detected in the dependency manifest (libs/dao/pom.xml). These CVEs enable unsafe deserialization and JNDI/gadget-based attacks that can lead to remote code execution when processing untrusted JSON. Address dependency and deserialization controls urgently."

Risk Level: 🟠 HIGH | AI Score: 70.0/100

Top 5 security issues / 126 total (Critical: 0, High: 0, Medium: 126, Low: 0)

  • MEDIUM CVE-2020-14060: jackson-databind: serialization in oadd.org.apache.xalan.lib.sq… (libs/dao/pom.xml:1)
  • MEDIUM CVE-2020-14061: jackson-databind: serialization in weblogic/oracle-aqjms (libs/dao/pom.xml:1)
  • MEDIUM CVE-2020-14062: jackson-databind: serialization in com.sun.org.apache.xalan.int… (libs/dao/pom.xml:1)
  • MEDIUM CVE-2020-14195: jackson-databind: serialization in org.jsecurity.realm.jndi.Jnd… (libs/dao/pom.xml:1)
  • MEDIUM CVE-2020-24616: jackson-databind: mishandles the interaction between serializat… (libs/dao/pom.xml:1)

For all five, the recommendation is to upgrade com.fasterxml.jackson.core:jackson-databind in libs/dao/pom.xml (the target version is obscured in the captured report).
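
On the "deserialization controls" part of the analysis: beyond upgrading the dependency, a common hardening step with Jackson 2.10+ is to restrict polymorphic deserialization to an explicit allow-list instead of enabling default typing for arbitrary classes. A minimal sketch, not tied to this PR's code (the factory class name and package prefix are placeholders):

    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.fasterxml.jackson.databind.jsontype.BasicPolymorphicTypeValidator;

    // Hypothetical helper: builds an ObjectMapper whose polymorphic typing is limited
    // to an allow-listed package prefix (placeholder value).
    public class SafeObjectMapperFactory {
        public static ObjectMapper create() {
            BasicPolymorphicTypeValidator ptv = BasicPolymorphicTypeValidator.builder()
                    .allowIfSubType("com.akto.dto.")
                    .build();
            ObjectMapper mapper = new ObjectMapper();
            mapper.activateDefaultTyping(ptv, ObjectMapper.DefaultTyping.NON_FINAL);
            return mapper;
        }
    }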

🔗 View Detailed Report
