ArchitectureID.ai’s full stack web application.
This full-stack web application serves ArchitectureID.ai's data models behind an end-user-friendly UI. The application consists of three containerized microservices:

- **models**: a tensorflow/serving instance that hosts the TensorFlow models generated by `arch-id-model`
- **backend**: a browser-friendly Python API that wraps the models service and handles image processing
- **frontend**: a user-friendly TypeScript/React UI
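Because the models service is a stock tensorflow/serving container, it speaks TensorFlow Serving's standard REST API. Below is a minimal sketch of a prediction request; the published port `8501` and the model name `arch_id` are placeholders, not values taken from this repository, so check the compose files and `arch-id-model` for the real ones.

```sh
# Query the models service directly via TensorFlow Serving's REST API.
# Port 8501 and the model name "arch_id" are assumptions.
curl -X POST "http://localhost:8501/v1/models/arch_id:predict" \
  -H "Content-Type: application/json" \
  -d '{"instances": [[[0.0, 0.0, 0.0]]]}'
```

In normal use the backend builds requests like this after preprocessing the uploaded image, so end users only interact with the frontend.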
The application is configured through the following environment variables:

- `MODELS_AWS_ACCESS_KEY_ID` (required): AWS access key for the S3 bucket used for model storage
- `MODELS_AWS_SECRET_ACCESS_KEY` (required): AWS secret key for the S3 bucket used for model storage
- `MODELS_AWS_REGION` (required): AWS region of the S3 bucket used for model storage
- `MODELS_S3_ENDPOINT` (required): AWS endpoint of the S3 bucket used for model storage (probably `s3.${MODELS_AWS_REGION}.amazonaws.com`)
- `EC2_HOST` (required for deployment): EC2 instance host domain
- `EC2_USER` (required for deployment): EC2 instance user
- `FRONTEND_AWS_ACCESS_KEY_ID` (required for deployment): AWS access key for the S3 bucket used for static site hosting
- `FRONTEND_AWS_SECRET_ACCESS_KEY` (required for deployment): AWS secret key for the S3 bucket used for static site hosting
- `FRONTEND_AWS_DEFAULT_REGION` (required for deployment): AWS region of the S3 bucket used for static site hosting
- `FRONTEND_S3_BUCKET_NAME` (required for deployment): name of the S3 bucket used for static site hosting
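For local work it can be convenient to keep these in a file you source into your shell. A sketch follows; every value shown is a placeholder to be replaced with your own credentials, region, and bucket details.

```sh
# All values are placeholders -- substitute your own credentials.
export MODELS_AWS_ACCESS_KEY_ID="AKIA................"
export MODELS_AWS_SECRET_ACCESS_KEY="********************"
export MODELS_AWS_REGION="us-east-1"
export MODELS_S3_ENDPOINT="s3.${MODELS_AWS_REGION}.amazonaws.com"

# Needed only for ./scripts/deploy.sh
export EC2_HOST="ec2-203-0-113-10.us-east-1.compute.amazonaws.com"
export EC2_USER="ubuntu"
export FRONTEND_AWS_ACCESS_KEY_ID="AKIA................"
export FRONTEND_AWS_SECRET_ACCESS_KEY="********************"
export FRONTEND_AWS_DEFAULT_REGION="us-east-1"
export FRONTEND_S3_BUCKET_NAME="example-frontend-bucket"
```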
To bring up a local development environment, use:
```sh
docker compose -f docker-compose.yml -f docker-compose.dev.yml up -d
```
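Once the stack is up, the standard Docker Compose commands can be used to inspect it, for example:

```sh
# List the running services and their published ports
docker compose -f docker-compose.yml -f docker-compose.dev.yml ps

# Tail the logs of a single service (service names as listed above)
docker compose -f docker-compose.yml -f docker-compose.dev.yml logs -f backend
```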
The included script `./scripts/deploy.sh` performs a quick single-node deployment to AWS. The script will build and serve the models and backend services on the configured AWS EC2 instance at ports 8051 and 80, and build and push the frontend to an S3 bucket. The EC2 instance must have docker and docker-compose V2 installed.
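A minimal sketch of a deployment run, assuming the environment variables listed above are already exported in the shell:

```sh
# Single-node deployment: ships the models and backend services to the EC2
# host and uploads the frontend build to the configured S3 bucket.
./scripts/deploy.sh
```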