Senior Engineer - Data Engineering

Employer
Jupiter Intelligence, Inc.
Location
New York, NY, US
Salary
Competitive
Closing date
Oct 29, 2021

Sector
Consultancy/Private Sector
Field
Conservation science
Discipline
Climate Change
Salary Type
Salary
Employment Type
Full time
Locations: San Mateo, CA | Boulder, CO | New York, NY

Jupiter is the global market leader in analytics for resilience planning and enterprise climate risk management, serving financial services, industry, the public sector, and NGOs.

Jupiter is led by pioneers in data, climate, and earth and ocean sciences, as well as technology, risk management, company building, and public policy. Our climate risk modeling solutions save lives and mitigate potentially catastrophic impacts inflicted by hurricanes, floods, heat waves, wildfires, drought, and other extreme weather events on homes, businesses, infrastructure, food and water supplies, and entire economies.

Bringing diversity to prepare a diverse planet for our changing climate

Jupiter was founded on the principle that with the right approaches and the right team, we can prepare Earth's economies to meet the challenges associated with climate change. The world is a diverse place; a diverse workforce in an inclusive environment is essential to meet our goals. We go forward together.

We seek new colleagues, including data scientists, physical scientists, software engineers, and company builders, who share our passion for excellence, innovation, and collaboration.

This position is in a fast-growing company created to meet the global demand for local climate and weather information to protect and develop assets, and to manage risk in operations. You will work with our exceptional scientific and technical staff, whose experience spans environmental modeling, impacts, and machine learning, as part of the team deploying models in an elastic computing environment.
Your critical responsibilities include, but are not limited to:
  • Design, develop, implement, optimize and maintain data engineering pipelines to Snowflake
  • Serve as the team's Snowflake expert and build a cloud data warehouse for analytical and data science workloads.
  • Identify, prioritize and execute tasks in the software development life cycle.
  • Analyze requirements and translate them into technical solutions.
  • Develop high-quality software solutions that deliver the required product features.
  • Self-learn to keep up with technologies, schedules, and deliverables.
  • Apply appropriate tools to analyze, identify, and resolve technical problems.
  • Develop and enforce coding guidelines and best practices.
  • Write design documentation, user documentation and test information as required.
  • Participate in team meetings: daily stand-ups, bi-weekly sprint reviews, design meetings.
  • Work closely with product, science, and delivery teams.
  • Proactively communicate status on all projects and releases.
  • Mentor junior and mid-level engineers.
Qualifications:
  • Bachelor's degree or equivalent practical experience.
  • 4+ years of experience with Snowflake, Snowpipe, SnowSQL and data sharing.
  • Previous experience with end-to-end implementation of Snowflake or a similar cloud-based data warehouse.
  • Excellent understanding and practical experience working with petascale data.
  • Good written and verbal communication skills.
  • Proficiency in object-oriented design and programming.
  • Proficiency in client-server architecture, containers and microservices.
  • Proficiency in programming languages such as Python, Java, or Go.
  • Proficiency in configuring and deploying applications on Linux-based systems.
  • Knowledge and proficiency of cloud-based systems such as AWS (EC2, S3, RDS, Lambda), GCP or Azure.
  • Experience architecting solutions on distributed processing frameworks such as Spark or Flink is a plus.
  • Experience with NoSQL databases (e.g., AWS DynamoDB) or distributed query engines (e.g., Presto, AWS Athena) is a plus.
  • Experience with geospatial data, projections, multidimensional data structures is a plus.
Jupiter Intelligence is an equal opportunity employer to all, regardless of age, ancestry, color, disability (mental and physical), exercising the right to family care and medical leave, gender, gender expression, gender identity, genetic information, marital status, medical condition, military or veteran status, national origin, political affiliation, race, religious creed, sex (includes pregnancy, childbirth, breastfeeding and related medical conditions), and sexual orientation.
Successful candidates must be authorized to work in the U.S.

Please submit your Cover Letter and Resume to us to see if there might be a great fit.