Job Title: Geospatial Software Engineer
Location: 100% Remote
Duration: Long-term contract
STRONG experience working with Golang and large geospatial data is a MUST HAVE.
The Geospatial Software Engineer will be involved in the design of big data solutions that leverage open-source and cloud-based technologies within the Location360 enterprise initiative and will work with multiple teams across the organization (e.g., cloud analytics, data architects, business groups). The engineer will participate in building large-scale data processing systems and APIs and should be comfortable working with the latest open-source technologies, embracing the challenge of processing petabytes or even exabytes of data daily in a high-throughput API/microservice ecosystem. The engineer understands how to apply technologies to solve big data problems and to develop innovative big data solutions, and generally works on complex projects focused on collecting, parsing, managing, analyzing, and making available large data sets to turn information into insights across multiple platforms. The engineer should be able to develop prototypes and proofs of concept for the selected solutions. This role will drive the engineering and building of geospatial data assets to support the Field Platform and the R&D product pipeline.
Key responsibilities include:
• Design, build, and support cloud and open-source systems that process geospatial data assets via an API-based platform
• Partner with other internal development communities to bring needed data sets into the asset and make data available to the Enterprise and internal development communities
• Build highly scalable APIs and the associated architecture to support thousands of requests per second
• Provide leadership in advancing understanding of environmental/external influences on field performance and risk factors
• Work at all stages of the software life cycle: Proof of Concept, MVP, Production, and Deprecation
Minimum Requirements:
• BSc degree in Computer Science or relevant job experience.
• Minimum of 2 years' experience with Python, Java, Go, or similar development languages.
• Extensive knowledge of different programming or scripting languages such as Go, Scala, Java, JavaScript, SQL, Bash, Python, and/or R.
• Experience developing HTTP APIs (REST and/or GraphQL) that serve data using open-source technology, preferably in a cloud environment.
• Ability to build and maintain modern cloud architecture, e.g., AWS, Google Cloud, etc.
• Experience working with PostgreSQL/PostGIS.
• Experience with code versioning and dependency management systems such as GitHub, SVN, and Maven.
• Proven success using Docker to build and deploy within a CI/CD environment, preferably using Kubernetes.
Desirable qualifications:
• MSc in Computer Science or a related field.
• Demonstrated knowledge of open-source geospatial solutions such as GeoServer, GeoTrellis, and GeoMesa.
• Experience with stream processing, e.g., Kafka.
• Highly proficient (4 years) in Golang.
• Experience working with customers/other developers to deliver full-stack development solutions, e.g., collecting software, data, and timeline requirements in an Agile environment.
• Demonstrated knowledge of agriculture and/or agriculture-oriented businesses.
• Experience implementing complex data projects with a focus on collecting, parsing, managing, and delivering large sets of data to turn information into insights using multiple platforms.
• Demonstrated experience adapting to new technologies.
• Capable of deciding on needed hardware and software design and acting on those decisions; able to develop prototypes and proofs of concept for the selected solutions.
• Experience with object-oriented design, coding, and testing patterns, as well as experience engineering (commercial or open-source) software platforms and large-scale data infrastructures.
• Experience creating cloud computing solutions and web applications leveraging public and private APIs.
• Proven experience (2 years) with distributed systems, e.g., Argo, Kubernetes, Spark, distributed databases, grid computing.
• Proficient (4+ years) working in a command-line interface environment, e.g., Docker, Argo, K8s, AWS CLI, gcloud, psql, SSH.
This is a remote position.