Dynamic Systems, Inc. (DSI) is a leading mechanical and process construction contractor with a long history of innovation and technical excellence. We’re building the next generation of internal data, automation, and AI platforms that empower project teams, engineers, and decision-makers across the organization.
Our IT & Data Engineering group is at the forefront of DSI’s digital transformation—consolidating systems, automating workflows, and integrating Microsoft Fabric, Power BI, and custom AI solutions to drive better outcomes across accounting, field operations, and project management.
This team values engineering craftsmanship, cross-disciplinary collaboration, and modern development practices to help DSI scale into a data-driven enterprise.
Overview
We are seeking a Platform Engineer to help design, build, and enhance DSI’s internal platforms, data infrastructure, and AI-powered applications. This role blends backend development, data engineering, and systems integration to support our automation and analytics initiatives. You’ll work closely with developers, data engineers, and AI specialists to deliver reliable, scalable, and intelligent internal tools.
This position is ideal for a developer or data engineer with strong Python experience who enjoys building data-driven systems, connecting services, and working across multiple layers of the technology stack.
Key Responsibilities
1. Platform & Application Development
- Develop and maintain backend APIs and internal services using Python (FastAPI + Uvicorn) and Pydantic v2.
- Support the architecture and implementation of shared tools and libraries across the internal platform.
- Build integrations that connect automation systems, data ingestion workflows, and AI-enabled features.
2. Data Engineering & Pipeline Development
- Design and manage data ingestion pipelines using Fabric, pandas, and Postgres.
- Create repeatable ETL/ELT workflows for processing engineering specifications, RFIs, and compliance documentation.
- Work with structured and unstructured data to improve data quality, consistency, and accessibility.
- Optimize database performance, queries, and storage structures to support analytics and AI workloads.
- Collaborate with the DevOps and AI teams to implement data versioning, schema evolution, and lineage tracking.
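A representative transform step in such a pipeline might look like the following pandas sketch. The column names (`spec_id`, `revised_at`) and cleaning rules are hypothetical, assumed only for illustration.

```python
import pandas as pd


def clean_spec_records(df: pd.DataFrame) -> pd.DataFrame:
    """Normalize raw engineering-spec rows before loading into a warehouse table."""
    out = df.copy()
    # Normalize identifiers so duplicates differing only by case/whitespace collapse.
    out["spec_id"] = out["spec_id"].str.strip().str.upper()
    out = out.drop_duplicates(subset="spec_id")
    # Coerce malformed dates to NaT, then drop rows we cannot timestamp.
    out["revised_at"] = pd.to_datetime(out["revised_at"], errors="coerce")
    out = out.dropna(subset=["revised_at"])
    return out.reset_index(drop=True)
```

The cleaned frame could then be loaded into Postgres with something like `df.to_sql("specs", engine, if_exists="append")` via SQLAlchemy.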
3. AI & System Integration
- Partner with AI/ML engineers to integrate machine learning models and LLM-based features into production applications.
- Develop APIs and services for model inference, data enrichment, and prompt orchestration.
- Contribute to ongoing efforts to embed AI capabilities within internal systems and automation workflows.
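One common pattern for this kind of integration is to keep the inference backend pluggable so the same orchestration code works against a local model or a hosted API. The sketch below assumes a hypothetical data-enrichment step; the record shape and prompt are illustrative.

```python
from typing import Callable


def enrich_with_summary(record: dict, infer: Callable[[str], str]) -> dict:
    """Attach an LLM-generated summary to a document record.

    `infer` is any callable mapping a prompt string to model output, so the
    caller decides which model or provider actually runs inference.
    """
    prompt = (
        "Summarize the following compliance note in one sentence:\n\n"
        f"{record['text']}"
    )
    # Return a new dict rather than mutating the caller's record.
    return {**record, "summary": infer(prompt)}
```

Because the model is injected as a plain callable, the function is trivially testable with a stub and swappable without touching pipeline code.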
4. Quality, Reliability & Observability
- Write clean, tested, and maintainable code using pytest, Ruff, and pre-commit workflows.
- Support CI/CD processes (Hatch, Makefile) to ensure smooth deployment and integration.
- Implement monitoring, metrics, and dashboards using Prometheus and Grafana for visibility into system health.
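Instrumentation of this kind typically starts with the official `prometheus_client` library. The sketch below wraps a unit of work with a counter and a duration histogram; the metric names and the `run_pipeline` wrapper are assumptions for illustration.

```python
import time
from typing import Callable

from prometheus_client import Counter, Histogram, generate_latest

# Hypothetical metric names; Grafana dashboards would chart these series.
RUNS = Counter("pipeline_runs_total", "Number of pipeline runs", ["status"])
DURATION = Histogram("pipeline_run_seconds", "Pipeline run duration in seconds")


def run_pipeline(step: Callable[[], None]) -> None:
    """Execute one pipeline step, recording outcome and duration."""
    start = time.perf_counter()
    try:
        step()
        RUNS.labels(status="success").inc()
    except Exception:
        RUNS.labels(status="error").inc()
        raise
    finally:
        DURATION.observe(time.perf_counter() - start)
```

`generate_latest()` renders the current metrics in Prometheus exposition format, which a `/metrics` endpoint would serve for scraping.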
5. Collaboration & Continuous Improvement
- Work closely with data engineers, developers, and DevOps to deliver consistent, well-documented releases.
- Participate in code reviews, design discussions, and architecture improvements.
- Suggest enhancements to developer tooling, automation, and observability practices.
- Contribute to documentation for workflows, data pipelines, and service configurations.
