Cloud Data & AI Platform Engineer

Contractual | Azure, DevOps, Data Engineering, Python, API
Birmingham, AL
Job ID: OOJ - 21207
On-Site role

Job Description:
  • The Cloud Data & AI Platform Engineer is a hands-on technical role responsible for designing, building, and operating advanced data and AI orchestration capabilities within Southern Nuclear’s Azure/Databricks-based Lakehouse environment. This role focuses on enabling reliable, governed, and auditable automation of data and analytics workflows using Azure Databricks and related Azure services.
  • The position supports the SNC Lakehouse by developing orchestration frameworks, AI-enabled processing pipelines, and integration services that extend beyond traditional ETL, while remaining compliant with regulatory, security, and cost-management standards expected in a nuclear operating environment.
  • This role partners closely with data engineering, analytics, and application teams to ensure that AI-enabled solutions are production-ready, maintainable, and aligned with enterprise architecture standards.
Key Responsibilities:
Data & AI Orchestration Engineering:
  • Design and implement reusable orchestration frameworks in Python to manage multi-step analytics, data quality checks, and AI-assisted workflows.
  • Develop controlled agent-based or task-specialized components to support activities such as data validation, metadata enrichment, code generation assistance, and operational diagnostics.
  • Ensure orchestration logic is deterministic, testable, and suitable for regulated production environments.
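The deterministic, testable orchestration framework described above can be illustrated with a minimal Python sketch. All names here (`Pipeline`, `Step`, the `validate`/`enrich` steps) are hypothetical, not SNC code: the point is that steps run in a fixed order and every run leaves an auditable trail.

```python
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class Step:
    name: str
    run: Callable[[Any], Any]

@dataclass
class Pipeline:
    steps: list[Step] = field(default_factory=list)
    audit_log: list[str] = field(default_factory=list)

    def add(self, name: str):
        """Register a step under a name; registration order is execution order."""
        def register(fn):
            self.steps.append(Step(name, fn))
            return fn
        return register

    def execute(self, payload: Any) -> Any:
        # Steps run in a fixed, deterministic order, so a given input
        # always yields the same output and the same audit trail.
        for step in self.steps:
            payload = step.run(payload)
            self.audit_log.append(f"{step.name}: ok")
        return payload

pipeline = Pipeline()

@pipeline.add("validate")
def validate(rows):
    # Data-quality gate: drop records missing a required field.
    return [r for r in rows if r.get("id") is not None]

@pipeline.add("enrich")
def enrich(rows):
    # Metadata enrichment: tag each surviving record.
    return [{**r, "layer": "silver"} for r in rows]

result = pipeline.execute([{"id": 1}, {"id": None}])
```

Because each step is a plain function, the framework stays unit-testable and suitable for the regulated production environments the role targets.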
Azure Databricks Platform Integration:
  • Deploy and operate orchestration and AI-enabled workloads within Azure Databricks, leveraging:
  • Delta Lake and Medallion Architecture (Bronze/Silver/Gold)
  • Databricks Workflows and Jobs
  • Unity Catalog for data governance and access control
  • Partner with Lakehouse platform leads to align solutions with SNC architectural standards.
System & API Integration:
  • Design and implement secure integration patterns with internal SNC applications and approved external vendor systems.
  • Ensure integrations follow enterprise security, identity, and data-handling requirements, including auditability and least-privilege access.
Performance, Cost, and Reliability Management:
  • Monitor and optimize Spark workloads, orchestration processes, and AI service calls to ensure efficient resource utilization.
  • Apply cost-awareness principles consistent with Client AR/CO and cloud financial management expectations.
  • Build solutions that can scale to intermittent high-volume workloads while remaining operationally stable.
Technical Leadership & Standards:
  • Contribute to architectural guidance, design reviews, and technical standards for data and AI solutions in the Lakehouse.
  • Ensure solutions are modular, maintainable, and aligned with long-term platform strategy.
  • Provide clear documentation and handoff materials to support ongoing operations and support teams.
Technical Qualifications:
Core Technical Skills:
  • Python: Advanced proficiency, including object-oriented design and asynchronous or event-driven patterns.
  • Data Engineering: Strong experience with PySpark, Delta Lake, and enterprise data lake architectures.
  • Azure Platform: Practical experience with Azure services such as:
  • Azure Databricks
  • Azure Functions and/or Logic Apps
  • Azure Container-based services (as applicable)
  • AI Enablement: Experience implementing AI-assisted or LLM-enabled workflows using structured orchestration patterns (e.g., task-based agents, supervisor/worker models).
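The supervisor/worker orchestration pattern named above can be sketched in a few lines of Python. This is an illustrative skeleton, not a production design: the stubbed workers stand in for what would, in practice, be LLM calls or Databricks jobs behind the same interface.

```python
from typing import Callable

# Hypothetical task-specialized workers; each handles one task type.
WORKERS: dict[str, Callable[[str], str]] = {
    "validate": lambda text: f"validated:{text}",
    "summarize": lambda text: f"summary:{text[:10]}",
}

def supervisor(task_type: str, payload: str) -> str:
    """Supervisor routes each task to the worker specialized for it."""
    worker = WORKERS.get(task_type)
    if worker is None:
        raise ValueError(f"no worker for task type {task_type!r}")
    return worker(payload)

print(supervisor("validate", "report-2024"))  # validated:report-2024
```

Keeping routing logic in a single supervisor makes the workflow structured and auditable, in contrast to free-form agent loops.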
Platform & DevOps:
  • Experience with CI/CD pipelines using GitHub Actions or Azure DevOps.
  • Familiarity with infrastructure-as-code or environment configuration management in Azure.
  • Strong understanding of secure development practices in regulated environments.
Professional Experience:
  • 5+ years of experience in software engineering, cloud platform engineering, or data engineering roles.
  • Demonstrated experience delivering production-grade data or analytics solutions in an enterprise environment.
  • Experience in the energy, utilities, nuclear, or other highly regulated industries is strongly preferred.
  • Exposure to handling sensitive operational, telemetry, or regulatory data is a plus.
Performance & Operating Expectations:
  • Accuracy & Auditability: Solutions must be reliable, traceable, and verifiable to support regulatory and operational requirements.
  • Scalability & Resilience: Designs must tolerate variable workload patterns without manual intervention.
Documentation & Standards Compliance:
  • Code adheres to PEP 8 and Client development standards.
  • Comprehensive docstrings, READMEs, and architectural artifacts are required. 
About us:
At our organization, we take our mission and values to heart! We are on a mission to offer more and better jobs all over the world. Our goal is to care for you while you care for our clients, and to pay you the best rates possible. All our associates are expected to embrace our RACE values: R - Results Matter, A - Approachable, C - Care, and E - Emergency, i.e., work with a sense of urgency.

For more relevant job opportunities please visit our website: Denken Solutions Careers