161 changes: 161 additions & 0 deletions scripts/dependabot/README.md
# **Dependabot and OpsLevel Integration**

This project provides a Python script that integrates **Dependabot alerts** with **OpsLevel** to automate the tracking and management of vulnerabilities across your repositories. The script fetches alerts from GitHub using the Dependabot API, processes the data, and sends it to OpsLevel via a custom event integration endpoint.

With this integration, you can also create **custom checks** in OpsLevel to ensure that critical vulnerabilities are addressed promptly.

---

## **Features**
- Fetches [Dependabot alerts for a repository](https://docs.github.com/en/rest/dependabot/alerts?apiVersion=2022-11-28#list-dependabot-alerts-for-a-repository) using the GitHub API. Update the API request as needed if you would rather:
  - [List Dependabot alerts for an enterprise](https://docs.github.com/en/rest/dependabot/alerts?apiVersion=2022-11-28#list-dependabot-alerts-for-an-enterprise)
  - [List Dependabot alerts for an organization](https://docs.github.com/en/rest/dependabot/alerts?apiVersion=2022-11-28#list-dependabot-alerts-for-an-organization)
- Processes the alerts and groups them by severity (e.g., `critical`, `high`, `medium`).
- Sends the processed data to OpsLevel's custom event integration endpoint.
- Supports custom OpsLevel checks to monitor and validate vulnerabilities.

---

## **Prerequisites**
1. **GitHub Personal Access Token**:
   - Required to authenticate with the GitHub API.
   - The token must have the `security_events` scope to access Dependabot alerts.

2. **OpsLevel Routing ID**:
   - A unique identifier for the OpsLevel custom event integration endpoint.

3. **Python Environment**:
   - Python `>=3.7` installed.
   - Dependencies installed via `pip` (see [Install Dependencies](#3-install-dependencies)).

---

## **Setup**

### **1. Clone the Repository**
```bash
git clone https://github.com/OpsLevel/community-integrations.git
cd community-integrations/scripts/dependabot
```
### **2. Create a Configuration File**

To store your GitHub and OpsLevel credentials securely, create a `.env` file in the root directory of the project with the following content:
```bash
GITHUB_TOKEN=your_github_personal_access_token
OPSLEVEL_ROUTING_ID=your_opslevel_routing_id
REPO_OWNER=your_repo_owner
REPO_NAME=your_repo_name
```
- **GITHUB_TOKEN**: Your GitHub Personal Access Token with the `security_events` scope.
- **OPSLEVEL_ROUTING_ID**: The unique routing ID for your OpsLevel custom event integration.
- **REPO_OWNER**: The owner of the repository (organization or username).
- **REPO_NAME**: The name of the repository.
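
Note that the script reads these values from environment variables (via `os.environ`), so the `.env` file needs to be loaded into your shell before running it. A minimal way to do that, assuming a POSIX shell and the `.env` file shown above:

```bash
# Export every variable defined in .env into the current shell session
set -a
source .env
set +a
```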

### **3. Install Dependencies**
Install the required Python packages:
```bash
pip install -r requirements.txt
```
### **4. Usage**
Run the Python script to fetch Dependabot alerts and send them to OpsLevel:
```bash
python dependabot_alerts.py
```
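
On success, the script prints confirmation messages and writes a local copy of the payload (the file name comes from the script's default). The output should look roughly like this:

```text
Payload sent successfully to OpsLevel.
Alerts saved to dependabot_alerts.json
```
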
### **5. How It Works**
1. **Fetch Dependabot Alerts**:
   - The script queries the GitHub API for Dependabot alerts for the specified repository.

2. **Process Alerts**:
   - Alerts are grouped by severity (critical, high, medium, etc.).
   - Each alert includes details like the dependency, vulnerability description, CVEs, and fix availability.

3. **Send to OpsLevel**:
   - The processed data is sent to OpsLevel's custom event integration endpoint.
   - Example payload:
```json
{
    "dependabot_alerts": {
        "high": {
            "open": 2,
            "alerts": [
                {
                    "fix": "No fix available",
                    "cves": ["CVE-2021-23337"],
                    "state": "open",
                    "dependency": "lodash",
                    "vulnerability": "Command Injection in lodash"
                },
                {
                    "fix": "No fix available",
                    "cves": ["CVE-2021-23337"],
                    "state": "open",
                    "dependency": "lodash",
                    "vulnerability": "Command Injection in lodash"
                }
            ],
            "closed": 0
        },
        "medium": {
            "open": 2,
            "alerts": [
                {
                    "fix": "No fix available",
                    "cves": ["CVE-2020-28500"],
                    "state": "open",
                    "dependency": "lodash",
                    "vulnerability": "Regular Expression Denial of Service (ReDoS) in lodash"
                },
                {
                    "fix": "No fix available",
                    "cves": ["CVE-2020-28500"],
                    "state": "open",
                    "dependency": "lodash",
                    "vulnerability": "Regular Expression Denial of Service (ReDoS) in lodash"
                }
            ],
            "closed": 0
        },
        "critical": {
            "open": 1,
            "alerts": [
                {
                    "fix": "No fix available",
                    "cves": ["CVE-2020-6836"],
                    "state": "open",
                    "dependency": "hot-formula-parser",
                    "vulnerability": "Command Injection in hot-formula-parser"
                }
            ],
            "closed": 0
        },
        "repository": "dependabot-demo"
    }
}
```
### **6. OpsLevel Custom Event Check Setup**
1. **Log in to OpsLevel**.
2. **Create a Custom Event Check**:
   - Navigate to **Maturity > Rubric > Add Check**.
   - Define a check to query the custom event data for critical vulnerabilities.
3. **Example Check Logic**:
   - Component specifier: `.dependabot_alerts | .repository`
   - Success condition (note that `open` is a number in the payload, so compare against a number): `.dependabot_alerts | select(.repository == $ctx.alias) | .high.open == 0`
   - Result Message:

```liquid
{% if check.passed %}
### Check passed
{% else %}
### Check failed
Service **{{ data.dependabot_alerts.repository }}** has **{{ data.dependabot_alerts.high.open }}** unresolved vulnerabilities.
{% endif %}
```
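
To gate on critical vulnerabilities instead, as the introduction suggests, the same pattern applies against the `critical` group of the example payload, e.g. `.dependabot_alerts | select(.repository == $ctx.alias) | .critical.open == 0`.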
102 changes: 102 additions & 0 deletions scripts/dependabot/dependabot_alerts.py
import requests
import json
import os
from collections import defaultdict

# Configuration is read from environment variables
GITHUB_TOKEN = os.environ["GITHUB_TOKEN"]
REPO_OWNER = os.environ["REPO_OWNER"]
REPO_NAME = os.environ["REPO_NAME"]
OPSLEVEL_URL = "https://upload.opslevel.com/integrations/custom_event/"
OPSLEVEL_ROUTING_ID = os.environ["OPSLEVEL_ROUTING_ID"]

# GitHub API URL for Dependabot alerts
URL = f"https://api.github.com/repos/{REPO_OWNER}/{REPO_NAME}/dependabot/alerts"

# Headers for GitHub API and OpsLevel
GITHUB_HEADERS = {
    "Authorization": f"Bearer {GITHUB_TOKEN}",
    "Accept": "application/vnd.github+json",
}

OPSLEVEL_HEADERS = {
    "content-type": "application/json",
    "X-OpsLevel-Routing-ID": OPSLEVEL_ROUTING_ID,
}

def fetch_dependabot_alerts():
    """Fetch Dependabot alerts for the configured repository from the GitHub API."""
    # The API returns 30 alerts per page by default; request up to 100 and add
    # pagination if your repository has more alerts than that.
    response = requests.get(URL, headers=GITHUB_HEADERS, params={"per_page": 100})
    if response.status_code != 200:
        print(f"Failed to fetch alerts: {response.status_code}")
        print(response.json())
        return None
    return response.json()

def organize_alerts_by_severity(alerts):
    # Group alerts by severity
    severity_grouped = defaultdict(lambda: {"open": 0, "closed": 0, "alerts": []})
    for alert in alerts:
        severity = alert['security_advisory']['severity']
        state = alert['state']
        identifiers = [ident['value'] for ident in alert['security_advisory']['identifiers'] if ident['type'] == 'CVE']

        # Increment open or closed alert count
        if state == "open":
            severity_grouped[severity]["open"] += 1
        elif state in {"fixed", "dismissed"}:
            severity_grouped[severity]["closed"] += 1

        # "first_patched_version" is null when no patched release exists for the vulnerable range
        first_patched = alert['security_vulnerability'].get('first_patched_version') or {}

        # Add alert details
        severity_grouped[severity]["alerts"].append({
            "dependency": alert['dependency']['package']['name'],
            "vulnerability": alert['security_advisory']['summary'],
            "cves": identifiers,
            "state": state,
            "fix": first_patched.get('identifier', 'No fix available'),
        })
    return severity_grouped

def send_to_opslevel(payload):
    """
    Send the JSON payload to OpsLevel custom event integration.
    """
    try:
        response = requests.post(OPSLEVEL_URL, headers=OPSLEVEL_HEADERS, json=payload)
        if response.status_code == 202:
            print("Payload sent successfully to OpsLevel.")
        else:
            print(f"Failed to send payload: {response.status_code}")
            print(response.text)
    except requests.RequestException as e:
        print(f"Error sending data to OpsLevel: {e}")

def save_to_json(data, filename="dependabot_alerts.json"):
    with open(filename, "w") as json_file:
        json.dump(data, json_file, indent=4)
    print(f"Alerts saved to {filename}")


def main():
    # Fetch Dependabot alerts
    alerts = fetch_dependabot_alerts()
    if not alerts:
        print("No alerts to process. Exiting.")
        return

    # Organize alerts by severity
    grouped_alerts = organize_alerts_by_severity(alerts)

    # Prepare the payload
    ol_req_payload = {
        "dependabot_alerts": {
            **grouped_alerts,
            "repository": REPO_NAME,
        }
    }

    # Send the payload to OpsLevel
    send_to_opslevel(ol_req_payload)
    save_to_json(ol_req_payload)

if __name__ == "__main__":
    main()
105 changes: 105 additions & 0 deletions scripts/export_users_and_teams/README.md
# OpsLevel Teams and Users Exporter

This Python script retrieves team and user data from OpsLevel using the GraphQL API and exports it to a CSV file.

## Prerequisites

* Python 3.6 or later
* `requests` library (`pip install requests`)
* OpsLevel API token (set as environment variable `OPSLEVEL_API_TOKEN`)

## Setup

1. **Install Python Dependencies:**

```bash
pip install requests
```

2. **Set OpsLevel API Token:**

Set your OpsLevel API token in an environment variable named `OPSLEVEL_API_TOKEN`, and the path where the CSV should be written in `OUTPUT_CSV_PATH`.

* **Linux/macOS:**

```bash
export OPSLEVEL_API_TOKEN="your_opslevel_api_token"
export OUTPUT_CSV_PATH="path_to_write_file"
```

* **Windows (Command Prompt):**

```bash
set OPSLEVEL_API_TOKEN=your_opslevel_api_token
set OUTPUT_CSV_PATH=path_to_write_file
```

* **Windows (PowerShell):**

```powershell
$env:OPSLEVEL_API_TOKEN = "your_opslevel_api_token"
$env:OUTPUT_CSV_PATH = "path_to_write_file"
```

* **Best practice:** For production systems, consider using more secure methods to store and retrieve your API token, such as environment files or secret management tools.

## Usage

1. **Run the script:**

```bash
python your_script_name.py
```

Replace `your_script_name.py` with the actual name of your Python script.

2. **Output:**

The script will write the exported team and user data to a CSV file at the path specified by `OUTPUT_CSV_PATH` (by default, `teams_and_users.csv` in the same directory as the script).
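
For example, to write the CSV somewhere other than the script's directory (assuming the script reads `OUTPUT_CSV_PATH` as described in the Setup section, and still using the placeholder script name from above):

```bash
OUTPUT_CSV_PATH="./exports/teams_and_users.csv" python your_script_name.py
```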

## CSV File Structure

The `teams_and_users.csv` file will have the following columns:

* `Name`
* `Team Alias`
* `Contact Type`
* `Contact Display Name`
* `Contact Address`
* `User ID`
* `User Name`
* `User Email`
* `Membership Role`

## Script Explanation

The script consists of two main functions plus a main execution block:

* **`get_all_teams_and_users(api_token)`:**
    * Retrieves team and user data from the OpsLevel GraphQL API.
    * Handles pagination to retrieve all data (a sketch of this loop follows the list below).
    * Returns a list of team data dictionaries.
* **`export_teams_and_users_to_csv(teams_data, output_csv_path="teams_and_users.csv")`:**
    * Exports the team and user data to a CSV file.
    * Handles cases where teams may not have contacts or memberships.
    * Writes the data in a structured format.
* **Main execution (`if __name__ == "__main__":`)**
    * Retrieves the API token from the environment variables.
    * Calls the `get_all_teams_and_users` function to retrieve the data.
    * Calls the `export_teams_and_users_to_csv` function to export the data to a CSV file.
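
Since the script itself is not reproduced here, the sketch below only illustrates the cursor-based pagination pattern that `get_all_teams_and_users` is described as using. The endpoint and the GraphQL field names (`account`, `teams`, `pageInfo`, `endCursor`) are assumptions based on typical GraphQL pagination, not a verbatim copy of the script's query:

```python
import requests

OPSLEVEL_GRAPHQL_URL = "https://api.opslevel.com/graphql"  # assumed API endpoint

def get_all_teams_and_users(api_token):
    """Illustrative cursor-based pagination loop; the real query and fields may differ."""
    headers = {"Authorization": f"Bearer {api_token}", "Content-Type": "application/json"}
    query = """
    query Teams($cursor: String) {
      account {
        teams(after: $cursor) {
          nodes { name alias }
          pageInfo { hasNextPage endCursor }
        }
      }
    }
    """
    teams, cursor = [], None
    while True:
        response = requests.post(
            OPSLEVEL_GRAPHQL_URL,
            headers=headers,
            json={"query": query, "variables": {"cursor": cursor}},
        )
        response.raise_for_status()
        page = response.json()["data"]["account"]["teams"]
        teams.extend(page["nodes"])  # accumulate this page's teams
        if not page["pageInfo"]["hasNextPage"]:
            return teams
        cursor = page["pageInfo"]["endCursor"]  # continue from the last cursor
```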

## Error Handling

The script includes error handling for:

* Invalid API responses.
* API request errors.
* CSV file writing errors.
* Missing API token environment variable.

## Notes

* Ensure that you have the necessary permissions to access the OpsLevel API.
* The CSV file will overwrite any existing file with the same name.