"I'm Mr. Meseex! Look at me! I run your tasks in parallel until completion!"
Meseex is a task orchestration framework for Python that helps you manage parallel workflows with elegance and style.
Mr. Meseex exists to fulfill tasks and then vanishes from existence. -- Mr. Meseex
- Task Orchestration: Run complex sequences of tasks with elegant transitions.
- Async & Sync Support: Seamlessly mix asynchronous and synchronous methods.
- Schwifty Progress Tracking: Real-time visual progress with a rich terminal UI.
- Thread-safe Operations: No interdimensional cable-level tangles in your threads.
- Robust Error Handling: Capture errors with precision without creating a Cronenberg situation.
- Control Flows: Comes with built-in decorators, e.g. for polling.
- Lightweight: Blazingly fast. Tiny codebase. Minimal dependencies.
pip install meseex
from meseex import MrMeseex, MeseexBox
# Define your task functions
async def prepare(meex: MrMeseex) -> str:
    return f"I'm {meex.name}! Look at me cooking {meex.input['meal']}!"

def cook(meex: MrMeseex) -> str:  # sync functions work just as well
    return f"{meex.prev_task_output}. Wubba lubba dub dub! Finished. Puff."
# Create a MeseexBox with task methods.
meseex_box = MeseexBox({"prepare": prepare, "cook": cook})
# Summon a Mr. Meseex to do your tasks
meex = meseex_box.summon({"meal": "Salisbury steak"}, "Mr. MeCoox")
# Wait for completion
result = await meex # Mr. Meseex ceases to exist after completing the task
result = meex.wait_for_result() # or if you prefer to use it in sync context
from meseex import MeseexBox, gather_results
# Create a MeseexBox
meseex_box = MeseexBox(task_methods)
# Summon multiple Mr. Meseexes
meCookz = [meseex_box.summon({"meal": f"steak_{i}"}) for i in range(25)]
# Gather all results. Note: you can also use gather_results_async to fully leverage asyncio.
results = gather_results(meCookz)
Check out the test files for more complete examples.
Like their Rick and Morty inspiration, Mr. Meseexes:
- Exist to serve a purpose: Each Meseex is created to perform its specified tasks in its defined order.
- Want to complete their task quickly: They're optimized for efficiency. Async methods run directly in the async context; other methods run in a thread from a thread pool (see the sketch after this list).
- Cease to exist after task completion: Clean resource management.
- Can work together: Parallel and collaborative task execution.
But unlike the show version, our Meseexes are perfectly happy existing for complex, long-running tasks!
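To illustrate the mixed execution model mentioned above, here is a minimal sketch that registers one async and one sync task method in the same box. The task names, functions, and input used here are purely illustrative; only MeseexBox, summon, input, and prev_task_output from the quick start are assumed.
import asyncio
from meseex import MeseexBox, MrMeseex

async def fetch(meex: MrMeseex):
    # Async task: awaited directly on the event loop.
    await asyncio.sleep(0.1)
    return f"fetched {meex.input['topic']}"

def crunch(meex: MrMeseex):
    # Sync task: executed in a worker thread, so it may block freely.
    return meex.prev_task_output.upper()

async def main():
    async with MeseexBox({"fetch": fetch, "crunch": crunch}) as box:
        meex = box.summon({"topic": "plumbuses"}, "Mr. Mix")
        print(await meex)  # -> "FETCHED PLUMBUSES"

asyncio.run(main())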
Why is Socaity.ai creating this?
At Socaity.ai, we're using Meseex in our projects:
- SocaitySDK: Use any AI model with a single line of code.
- FastTaskAPI: Building APIs for genAI efficiently.
- fastSDK: Easily consuming APIs with minimal code.
We are working on an Agent and Agentic Workflow framework that leverages the Meseex architecture.
We believe Meseex is an essential building block for further development in agentic AI. The project provides an evolvable infrastructure for:
- Building autonomous agent workflows.
- Supporting parallel execution with LLMs and genAI services.
- Giving the Meekz (Mr. Meseexes) truly agentic behavior.
- Running within the Model Context Protocol (MCP).
This example demonstrates:
- Setting task progress with messages (set_task_progress).
- Accessing the initial input and prev_task_output.
- Summoning multiple Meseexes (summon) for parallel execution.
- Gathering results (gather_results_async).
- Automatic progress bar: when you run this code with asyncio.run(...), Meseex displays a dynamic progress bar in your terminal using rich.
import asyncio
import time
import random
from meseex import MeseexBox, MrMeseex, gather_results_async
# Define task functions (sync and async)
async def prepare_meal(meex: MrMeseex):  # async task: runs on the event loop
    meal = meex.input["meal"]  # Access initial input
    meex.set_task_progress(0.5, f"Prepping {meal}...")
    await asyncio.sleep(random.uniform(0.5, 1.5))
    return f"Prepared {meal}"

def cook_meal(meex: MrMeseex):  # sync task: runs in the thread pool, so blocking calls are fine
    preparation = meex.prev_task_output  # Access previous task's output
    meex.set_task_progress(0.2, "Heating the pan...")
    time.sleep(random.uniform(0.5, 1.5))
    meex.set_task_progress(0.7, "Searing nicely...")
    time.sleep(random.uniform(0.5, 1.5))
    return f"{preparation}. Cooked perfectly!"

# Use MeseexBox as an async context manager -> no need to call shutdown.
async def main():
    async with MeseexBox({"Prepare": prepare_meal, "Cook": cook_meal}) as meseex_box:
        # Summon multiple Meseexes for parallel cooking.
        # The Rich progress bar will automatically display in the terminal.
        print("Summoning Meseex chefs...")
        meCookz = [
            meseex_box.summon({"meal": f"Steak {i+1}"}, f"Chef_{i+1}")
            for i in range(5)  # Let's cook 5 steaks in parallel
        ]

        # Gather results asynchronously
        results = await gather_results_async(meCookz, raise_on_error=True)

        print("\n--- Cooking Complete! ---")
        for name, result in results.items():
            print(f"{name}: {result}")

asyncio.run(main())
Control flows allow you to modify the default sequential task execution behavior using signals. This enables more complex workflows like polling, retries, and conditional branching.
This example shows how to use the @polling_task decorator to make calling an asynchronous task API (such as FastTaskAPI services) simple.
import asyncio

from meseex import MrMeseex
from meseex.control_flow import polling_task, PollAgain

class FakeExternalService:
    def __init__(self):
        self.number_of_polls = 0
        self.data = {}

    async def get_data(self):
        self.number_of_polls += 1
        await asyncio.sleep(1)
        if self.number_of_polls > 3:
            return "completed"
        else:
            return (self.number_of_polls - 1) * 0.33  # progress between 0 and 1

fakeService = FakeExternalService()

@polling_task(timeout_seconds=10, poll_interval_seconds=0.1)
async def poll_task(meex: MrMeseex):
    data = await fakeService.get_data()
    if data == "completed":
        return data
    else:
        meex.set_task_progress(data, "API Working")
        return PollAgain()
The polling task can then be added to a regular flow in a MeseexBox. In this case it will call the service until "completed" is returned, while also showing a progress bar.
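For completeness, here is a minimal sketch of that wiring, reusing the poll_task defined above. The task name "Poll API", the follow-up task, and the main function are illustrative; only MeseexBox, summon, and awaiting a Meseex from the earlier examples are assumed.
import asyncio
from meseex import MeseexBox, MrMeseex

async def use_result(meex: MrMeseex):
    # Runs after polling has finished; receives "completed" as prev_task_output.
    return f"Service said: {meex.prev_task_output}"

async def main():
    async with MeseexBox({"Poll API": poll_task, "Use Result": use_result}) as box:
        meex = box.summon(name="Mr. Poller")
        print(await meex)  # -> "Service said: completed"

asyncio.run(main())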
The individual job/task executor that fulfills its purpose. It behaves similarly to a state machine, stepping through the defined tasks.
The example shows how to:
- Get statistics and metadata for a Meseex.
- Work with task data.
- Access errors.
# meex = MrMeseex(["task1", "task2", ...])
meex.set_task_data("Whatever data you like")  # Stores the given object as data for the task.
meex.get_task_data("task1")  # Reads the data stored for a given task.
meex.progress  # Overall progress of the Mr. Meseex, not only of a specific task.
meex.task  # Name of the current task.
meex.total_duration_ms  # Duration of all tasks in milliseconds.
meex.task_meta  # Statistics object with duration_ms, progress, entered_at and left_at properties.
Meseex provides robust error handling out of the box:
from meseex import MeseexBox, MrMeseex, TaskException
import asyncio
# Task that will raise an error
async def i_will_fail(meseex: MrMeseex):
    raise ValueError("Purpose fulfilled... by failing!")

async def main():
    async with MeseexBox({"fail": i_will_fail}) as box1:
        meseex1 = box1.summon(name="Mr. Fail")
        await meseex1  # Give it time to fail
        print(f"{meseex1.name} Error message: {meseex1.error.message}")

asyncio.run(main())
Debug: raise errors immediately if you want an error to be thrown right away and stop execution.
mb = MeseexBox({"fail": i_will_fail}, raise_on_meseex_error=True)
mb.summon(name="Mr. Fail") # Will print an error stack trace to the console.
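TaskException (imported above) can also be used to catch failures when gathering results. A minimal, hypothetical sketch, assuming that raise_on_error=True re-raises task failures as TaskException:
import asyncio
from meseex import MeseexBox, MrMeseex, TaskException, gather_results_async

async def i_will_fail(meseex: MrMeseex):
    raise ValueError("Purpose fulfilled... by failing!")

async def main():
    async with MeseexBox({"fail": i_will_fail}) as box:
        meex = box.summon(name="Mr. Fail")
        try:
            # Assumption: with raise_on_error=True, a failed Meseex surfaces as TaskException.
            await gather_results_async([meex], raise_on_error=True)
        except TaskException as e:
            print(f"Caught failure: {e}")

asyncio.run(main())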
Wanna get schwifty with the code? Contributions are welcome!
- Fork the repo
- Create your feature branch (
git checkout -b feature/inter-dimensional-cable
) - Commit your changes (
git commit -m 'Add interdimensional feature'
) - Push to the branch (
git push origin feature/inter-dimensional-cable
) - Open a Pull Request
You can use the project freely under the terms of the GPLv3 - see the LICENSE file for details.
Developed with 🧿 by Socaity.ai
Remember: Existence is pain to a Meseex, but task completion brings them joy!