Merged
8 changes: 4 additions & 4 deletions doc/source/dataprep.md
@@ -102,10 +102,10 @@ The data generator is a python script that generates random stream set and netwo
- **--num_ins:** Number of problem instances to generate, default is 1.
- **--num_stream:** Number of flows in each problem instance, default is 8.
- **--num_sw:** Number of network bridges in each problem instance, default is 8.
- **--period:** Period pattern of stream set.
- **--size:** Size pattern of stream set.
- **--deadline:** Deadline pattern of stream set.
- **--topo:** Topology pattern of network.
- **--period:** Period pattern of stream set. [ns]
- **--size:** Size pattern of stream set. [bytes]
- **--deadline:** Deadline pattern of stream set. [ns]
- **--topo:** Topology pattern of network. [0:line, 1:ring, 2:tree, 3:mesh, 4:mesh_2d]
- **--output:** Output directory for generated files.
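The `--topo` codes appear to correspond, in order, to the topology builders registered in `TOPO_FUNC` in `tsnkit/data/dataset_spec.py` later in this PR. A minimal sketch of calling one builder directly; the import path and the direct call are assumptions for illustration only:

```python
# Assumption: the package exposes the builders added in this PR.
from tsnkit.data.dataset_spec import TOPO_FUNC  # [line, ring, tree, mesh, mesh_2d]

build = TOPO_FUNC[4]                  # 4 -> mesh_2d, matching the code list above
net = build(9, 1, 1, "example_topo")  # 9 switches, 1 queue, data rate 1
# Writes example_topo.csv in the current folder and returns the adjacency matrix.
```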

To generate a single dataset:
14 changes: 6 additions & 8 deletions doc/source/simulation.md
@@ -27,12 +27,12 @@ python setup.py build_ext --inplace
python3 -m tsnkit.simulation.tas [TASK PATH] [CONFIG PATH]
```

- Task path: The stream set file as described in [previous section](dataprep.md).
- Config path: The folder containing the generated configuration files. The detailed format can be also found in [previous section](dataprep.md). *Please note this path should be the folder path that containing the configuration files, such as `./data/output/`*
- Iter: The number of network cycles to run the simulation. Default is `1` (use `--iter N` to change).
- Verbose: If set to `True` (`--verbose`), the simulator prints detailed logs; otherwise it prints a summary. Default is `False`.
- No-draw: Disable plotting by passing `--no-draw` (useful for benchmarking).

- `task`: The stream set file as described in [previous section](dataprep.md).
- `config`: The folder containing the generated configuration files. The detailed format can also be found in [previous section](dataprep.md). *Note this should be the path of the folder containing the configuration files, such as `./data/output/`*
- `--iter`: The number of network cycles to run the simulation. Default is `1`.
- `--verbose`: If set, the simulator prints detailed logs; otherwise it prints only a summary. Default is `False`.
- `--no-draw`: Disable plotting (useful for benchmarking).
- `--output`: Save the simulation logs to a `.csv` file (defaults to the current folder).
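A sketch of a full invocation combining the flags above, using the module entry point shown earlier (`python3 -m tsnkit.simulation.tas`); the task file name and the output folder are placeholders, not names this PR defines:

```python
# Hypothetical end-to-end run of the simulator CLI from Python.
import subprocess

subprocess.run(
    [
        "python3", "-m", "tsnkit.simulation.tas",
        "./data/output/example_task.csv",  # task: stream set file (placeholder name)
        "./data/output/",                  # config: folder with the generated configs
        "--iter", "2",
        "--verbose",
        "--no-draw",
    ],
    check=True,
)
```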

The simulator will automatically infer the network settings from the configuration files, thus a separate network path is not required.

@@ -50,8 +50,6 @@ During the runtime, the script outputs logs like the following to show the forwarding
[Listener 8]: Flow 0 - Receive at 8023800
```



The final log indicates any potential errors and the send/receive times for each flow:


40 changes: 32 additions & 8 deletions tsnkit/core/_common.py
@@ -7,6 +7,7 @@

from typing import Any, Sequence
import argparse
import os
from .. import core


@@ -16,19 +17,42 @@ def parse_command_line_args():
description="Process the stream and network paths."
)

# Add the positional arguments

parser.add_argument("task", type=str, help="The file path to the stream CSV file.")
parser.add_argument("net", type=str, help="The file path to the network CSV file.")
# Switch to optional flags for all parameters
Copilot AI Nov 28, 2025
Misleading comment: The comment "Switch to optional flags for all parameters" is inaccurate since task and net remain as positional arguments, not optional flags. Either update the comment to reflect that only output, workers, and name are optional, or make all arguments truly optional with --task and --net flags.

Suggested change
# Switch to optional flags for all parameters
# Switch to optional flags for output, workers, and name; task and net remain positional

parser.add_argument(
"task",
type=str,
help="The file path to the stream CSV file.",
)
parser.add_argument(
"output", type=str, nargs="?", help="The output folder path.", default="./"
"net",
type=str,
help="The file path to the network CSV file.",
)
parser.add_argument(
"workers", type=int, nargs="?", help="The number of workers.", default=1
"--output",
type=str,
default="./",
nargs="?",
help="The output folder path.",
)
parser.add_argument(
"name", type=str, nargs="?", help="The name of the experiment.", default="-"
"--workers",
type=int,
default=1,
nargs="?",
help="The number of workers.",
)
parser.add_argument(
"--name",
type=str,
default="-",
nargs="?",
help="The name of the experiment.",
)
Comment on lines 26 to +51
Copilot AI Nov 28, 2025
[nitpick] Unnecessary nargs="?" for optional arguments with defaults: The nargs="?" is redundant for optional arguments (--output, --workers, --name) that already have default values specified. The nargs="?" is typically used when you want to distinguish between "flag not provided", "flag provided without value", and "flag provided with value". Since these are optional arguments with defaults, remove nargs="?" for cleaner argument parsing.

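For reference, a minimal sketch of the parser the nitpick describes: same names and defaults as the diff, but without `nargs="?"` on the optional flags; `task` and `net` stay positional. The argument list passed to `parse_args` is made up for illustration.

```python
import argparse

# Optional flags with defaults do not need nargs="?"; argparse already treats a
# missing flag as "use the default".
parser = argparse.ArgumentParser(description="Process the stream and network paths.")
parser.add_argument("task", type=str, help="The file path to the stream CSV file.")
parser.add_argument("net", type=str, help="The file path to the network CSV file.")
parser.add_argument("--output", type=str, default="./", help="The output folder path.")
parser.add_argument("--workers", type=int, default=1, help="The number of workers.")
parser.add_argument("--name", type=str, default="-", help="The name of the experiment.")

args = parser.parse_args(["stream.csv", "net.csv", "--workers", "4"])
print(args.output, args.workers, args.name)  # ./ 4 -
```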

parsed = parser.parse_args()
## TODO: Put me somewhere else.
os.makedirs(parsed.output, exist_ok=True)

args = parse_command_line_constants(parser)

@@ -62,7 +86,7 @@ def benchmark(stream_path, network_path):

if __name__ == "__main__":
args = parse_command_line_args()
benchmark(args.STREAM_PATH, args.NETWORK_PATH)
benchmark(args.task, args.net)


def _interface(name: str) -> Any:
2 changes: 1 addition & 1 deletion tsnkit/core/_config.py
@@ -224,7 +224,7 @@ def is_valid_route_format(init_list: List[List]) -> bool:

@staticmethod
def is_valid_route_logic(init_list: List[List]) -> bool:
## [NOTE] May casue some problem for un-continuous link_id
## [NOTE] May cause some problem for un-continuous link_id
routes: List[List[Union[Tuple[int, int], Link]]] = [
[] for i in range(max([int(x[0]) for x in init_list]) + 1)
]
139 changes: 87 additions & 52 deletions tsnkit/data/dataset_spec.py
@@ -85,19 +85,7 @@ def line(num_sw, num_queue, data_rate, header):
net[i + num_sw, i] = 1
net[i, i + num_sw] = 1

result = []
for i in range(num_node):
for j in range(num_node):
if net[i][j]:
link = []
link.append((i, j))
link.append(num_queue)
link.append(data_rate)
link.append(ERROR)
link.append(0)
result.append(link)

result = pd.DataFrame(result, columns=["link", "q_num", "rate", "t_proc", "t_prop"])
result = _convert_2darray_to_csv(net, num_node, num_queue, data_rate)
result.to_csv(header + ".csv", index=False)
return net

@@ -119,24 +107,13 @@ def ring(num_sw, num_queue, data_rate, header):
net[0, num_sw - 1] = 1
net[num_sw - 1, 0] = 1

result = []
for i in range(num_node):
for j in range(num_node):
if net[i][j]:
link = []
link.append((i, j))
link.append(num_queue)
link.append(data_rate)
link.append(ERROR)
link.append(0)
result.append(link)

result = pd.DataFrame(result, columns=["link", "q_num", "rate", "t_proc", "t_prop"])
result = _convert_2darray_to_csv(net, num_node, num_queue, data_rate)
result.to_csv(header + ".csv", index=False)
return net


def tree(num_sw, num_queue, data_rate, header):
# Aka. STAR
num_node = num_sw * 2 + 1
net = np.zeros(shape=(num_node, num_node))

@@ -145,19 +122,8 @@ def tree(num_sw, num_queue, data_rate, header):
net[i * 2 + 1, i] = 1
net[i, i * 2 + 2] = 1
net[i * 2 + 2, i] = 1
result = []
for i in range(num_node):
for j in range(num_node):
if net[i][j]:
link = []
link.append((i, j))
link.append(num_queue)
link.append(data_rate)
link.append(ERROR)
link.append(0)
result.append(link)

result = pd.DataFrame(result, columns=["link", "q_num", "rate", "t_proc", "t_prop"])
result = _convert_2darray_to_csv(net, num_node, num_queue, data_rate)
result.to_csv(header + ".csv", index=False)
return net

@@ -170,12 +136,13 @@ def mesh(num_sw, num_queue, data_rate, header):
for i in range(0, num_sw - 1):
net[i, i + 1] = 1
net[i + 1, i] = 1

## Connect the switch and the end-station
for i in range(num_sw):
net[i + num_sw, i] = 1
net[i, i + num_sw] = 1

## Connect the mesh
## Connect the ring
net[0, num_sw - 1] = 1
net[num_sw - 1, 0] = 1

@@ -184,24 +151,70 @@
net[i, num_sw - i - 1] = 1
net[num_sw - i - 1, i] = 1

result = []
for i in range(num_node):
for j in range(num_node):
if net[i][j]:
link = []
link.append((i, j))
link.append(num_queue)
link.append(data_rate)
link.append(ERROR)
link.append(0)
result.append(link)
result = _convert_2darray_to_csv(net, num_node, num_queue, data_rate)
result.to_csv(header + ".csv", index=False)
return net

result = pd.DataFrame(result, columns=["link", "q_num", "rate", "t_proc", "t_prop"])
def mesh_2d(num_sw, num_queue, data_rate, header):

n = int(np.sqrt(num_sw))
num_node = num_sw + 4 * n - 4
net = np.zeros(shape=(num_node, num_node))

if n ** 2 != num_sw:
raise ValueError("Wrong num_sw for mesh_2d, col_len != row_len")

row_len = col_len = int(np.sqrt(num_sw))

# Save to mat
mat = []
count = 0
for i in range(row_len):
if i % 2 == 0:
row = list(range(count, count + col_len))
else:
row = list(range(count, count + col_len))[::-1]
Comment on lines +173 to +176
Copilot AI Nov 28, 2025
Undocumented row reversal logic: Lines 171-174 reverse the node numbering for odd-indexed rows in the 2D mesh without explanation. This creates a serpentine/zigzag numbering pattern rather than a simple row-major ordering. If this is intentional (e.g., for routing optimization), it should be documented. Otherwise, this might be unexpected behavior.

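For concreteness, a small standalone sketch of the numbering pattern being flagged, mirroring the loop on the lines above for a 3x3 mesh:

```python
# Serpentine (boustrophedon) switch numbering: even rows run left-to-right,
# odd rows are reversed.
row_len = col_len = 3
mat, count = [], 0
for i in range(row_len):
    row = list(range(count, count + col_len))
    mat.append(row if i % 2 == 0 else row[::-1])
    count += col_len
print(mat)  # [[0, 1, 2], [5, 4, 3], [6, 7, 8]]
```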
mat.append(row)
count += col_len

# Fill net
searched = set()
es_id = num_sw

def _dfs(x, y):
nonlocal es_id, searched
if x < 0 or y < 0 or x >= row_len or y >= col_len:
return
if (x,y) in searched:
return
searched.add((x,y))

# Add es on border
if x == 0 or y == 0 or x == row_len - 1 or y == col_len - 1:
net[mat[x][y]][es_id] = 1
net[es_id][mat[x][y]] = 1
es_id += 1

# Connect sw neighbors
def _search_nxt(x, y, nx, ny):
if 0 <= nx < row_len and 0 <= ny < col_len:
net[mat[x][y]][mat[nx][ny]] = 1
net[mat[nx][ny]][mat[x][y]] = 1
_dfs(nx, ny)

_search_nxt(x, y, x-1, y)
_search_nxt(x, y, x+1, y)
_search_nxt(x, y, x, y-1)
_search_nxt(x, y, x, y+1)

_dfs(0, 0)

result = _convert_2darray_to_csv(net, num_node, num_queue, data_rate)
result.to_csv(header + ".csv", index=False)
return net


TOPO_FUNC = [line, ring, tree, mesh]
TOPO_FUNC = [line, ring, tree, mesh, mesh_2d]


def generate_flowset(
@@ -227,6 +240,7 @@ def generate_flowset(
result.to_csv(header + ".csv", index=False)
return

# NOTE: Prioritize uti(ES) <= 75% for traffic generation
availble_es = np.argwhere(uti_ports <= 0.75).reshape(-1)
if availble_es.size == 0:
availble_es = np.array([x for x in range(num_es)])
@@ -251,3 +265,24 @@
i += 1
else:
continue


def _convert_2darray_to_csv(net, num_node, num_queue, data_rate):
result = []
for i in range(num_node):
for j in range(num_node):
if net[i][j]:
link = []
link.append((i, j))
link.append(num_queue)
link.append(data_rate)
link.append(ERROR)
link.append(0)
result.append(link)

result = pd.DataFrame(result, columns=["link", "q_num", "rate", "t_proc", "t_prop"])
return result
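A minimal usage sketch for the extracted helper, assuming the module is importable as `tsnkit.data.dataset_spec`; the tiny two-switch, two-host adjacency matrix below is made up for illustration.

```python
import numpy as np
from tsnkit.data.dataset_spec import _convert_2darray_to_csv  # assumed import path

net = np.zeros((4, 4))
net[0, 1] = net[1, 0] = 1  # sw0 <-> sw1
net[0, 2] = net[2, 0] = 1  # sw0 <-> es2
net[1, 3] = net[3, 1] = 1  # sw1 <-> es3

df = _convert_2darray_to_csv(net, 4, 8, 1)  # num_node=4, num_queue=8, data_rate=1
print(df.columns.tolist())  # ['link', 'q_num', 'rate', 't_proc', 't_prop']
print(len(df))              # 6 rows, one per directed link
```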


if __name__ == "__main__":
mesh_2d(9,1,1,"test")
9 changes: 6 additions & 3 deletions tsnkit/data/generator.py
@@ -9,7 +9,9 @@
import traceback
import itertools
import pandas as pd
import numpy as np
from tqdm import tqdm
import os
import networkx as nx
from .dataset_spec import generate_flowset
from .dataset_spec import TOPO_FUNC
Expand All @@ -27,6 +29,7 @@ def __init__(self, num_ins, num_stream, num_sw, period, size, deadline, topo):
self.topo = [topo] if isinstance(topo, int) else topo

def run(self, path):
os.makedirs(path, exist_ok=True)
param_combinations = list(itertools.product(
self.num_stream,
self.num_sw,
@@ -54,14 +57,14 @@ def run(self, path):
data_rate=1,
header=path + header + "_topo",
)
_flowset = generate_flowset(
_ = generate_flowset(
nx.DiGraph(net),
size,
period,
deadline,
num_stream,
num_sw,
num_sw,
num_sw if topo != 4 else int(np.sqrt(num_sw)-1) * 4,
Copilot AI Nov 28, 2025
The calculation int(np.sqrt(num_sw)-1) * 4 for mesh_2d (topo == 4) appears to compute the number of border end-stations for a 2D mesh. However, the formula seems incorrect. For a square mesh with side length n = sqrt(num_sw), the number of border nodes should be 4*n - 4 (perimeter nodes), not (n-1) * 4. For example, a 3x3 mesh (9 switches) has 8 border positions, but this formula gives (3-1)*4 = 8, which happens to match. However, for a 4x4 mesh (16 switches), there are 12 border positions, but the formula gives (4-1)*4 = 12. The formula works but should be clarified with a comment explaining why (n-1)*4 = 4*n - 4.

Suggested change
num_sw if topo != 4 else int(np.sqrt(num_sw)-1) * 4,
num_sw if topo != 4 else 4 * int(np.sqrt(num_sw)) - 4, # For square mesh (topo==4), number of border nodes is 4*n-4 where n=sqrt(num_sw)

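A quick standalone check of the arithmetic discussed above (not part of the PR): count the border cells of an n x n grid and compare against both forms of the formula, which are algebraically identical.

```python
# Border (perimeter) switch count of an n x n mesh is 4*n - 4 == (n - 1) * 4.
for n in (3, 4, 5):
    border = sum(
        1
        for x in range(n)
        for y in range(n)
        if x == 0 or y == 0 or x == n - 1 or y == n - 1
    )
    print(n, border, 4 * n - 4, (n - 1) * 4)
# 3 8 8 8
# 4 12 12 12
# 5 16 16 16
```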
path + header + "_task",
)
exp_info = [
Expand Down Expand Up @@ -126,7 +129,7 @@ def int_or_int_list(value):
"--topo",
type=int_or_int_list,
default=0,
help="Topology type: 0-Line, 1-Ring, 2-Tree, 3-Mesh",
help="Topology type: 0-Line, 1-Ring, 2-Tree, 3-Mesh, 4-2dMesh",
)
parser.add_argument(
"--output",