Description: *W2 Applicants Only - Cannot Support C2C*
The Geospatial Big Data and Cloud Ops Engineer will help design big data solutions that leverage open-source and cloud-based technologies within the geospatial data enterprise initiative, working with multiple teams across the organization (e.g. cloud analytics, data architects, business groups). The engineer will participate in building large-scale data processing systems, bring expertise in data warehousing solutions, and be able to work with the latest (NoSQL) database technologies across AWS and Google Cloud (or similar).
A big data engineer should embrace the challenge of dealing with petabytes or even exabytes of data daily, understand how to apply technologies to solve big data problems, and develop innovative big data solutions. The role generally involves implementing complex big data projects with a focus on collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into insights across multiple platforms, including developing prototypes and proofs of concept for selected solutions. This role will drive the engineering and building of geospatial data assets to support the company’s Field Platform and R&D product pipeline.
• You will work with other team SREs and Software Engineers to “automate all things” for our hybrid cloud platform.
• Build tooling to improve CI/CD patterns for multiple development teams that are consistent with security and DevOps best practices
• Create tooling around GitOps to automate the deployment of code and infrastructure
• Coordinate with and assist teams in building infrastructure competencies using object-oriented programming and configuration-management domain-specific languages
• Create Terraform artifacts to automate the deployment and management of Cloud Infrastructure
• Create Kubernetes-related artifacts based on Kustomize
• Continuously improve the observability of the platform's tech stack by integrating logging, testing, metrics, tracing, dashboards, and alerts with CD pipelines
• Experience working with large geospatial data sets using GDAL and similar spatial libraries
• Ability to quickly learn technologies such as Terraform, Packer, FastAPI, Kubernetes, Nginx, Elasticsearch, PostgreSQL, AWS, GCP, Grafana, Kibana and Google Data Studio
• Basic understanding of OAuth 2.0 and OpenID Connect
• Software engineering experience with programming languages like Python, Golang or Java
• Experience building CI/CD pipelines
• Experience with creating Kubernetes operators using Kubebuilder or Operator SDK
• Experience with Argo CD / Flux / Argo Workflows
• Experience implementing a GitOps workflow
• Experience with object-oriented design, coding, and testing patterns, as well as experience engineering (commercial or open-source) software platforms and large-scale data infrastructures
• Experience creating cloud computing solutions and web applications leveraging public and private APIs