Location: REMOTE
Salary: $59.00 - $60.00 USD Hourly
Description:
We are actively seeking a skilled Senior Cloud Data Engineer to spearhead our data initiatives and drive impactful solutions. Your expertise will be pivotal in developing dynamic ELT/ETL data pipelines to efficiently load data into our data lakes and warehouses, applying sophisticated business logic for optimal data structuring. You will provide impactful technical leadership and mentorship, empowering our team to excel in delivering innovative data solutions.
This opportunity allows for remote work for candidates located in the United States.
Your future duties and responsibilities
• Work closely with the product owner and BSA to thoroughly understand the business user stories
• Work with the Data Architect to accurately interpret the data model and data warehouse design when designing data flows
• Design source-to-target data mapping specifications with the appropriate transformation rules
• Develop reusable design patterns for standardization of data pipelines
• Develop ELT/ETL data pipelines to load data from various data sources to data lake and data warehouse and apply complex business logic to populate data structures
• Design and implement batch, near real-time, and real-time (data streaming) solutions, as appropriate
• Build reusable components for a scalable data integration methodology, such as error handling, logging, and recoverability (see the sketch after this list)
• Participate in the data modeling / schema design
• Perform and support data warehouse and data pipeline performance optimization
• Provide technical thought leadership and mentorship to others, as needed
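For illustration only, and not part of the formal responsibilities: below is a minimal sketch of the kind of reusable, recoverable pipeline step described above. The table names, stage, credentials, and the use of snowflake-connector-python are assumptions for the sketch, not details from this posting.

    import logging
    import time

    import snowflake.connector  # assumption: snowflake-connector-python is installed

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("pipeline")

    def run_step(conn, name, sql, retries=3, backoff_s=5.0):
        """Reusable pipeline step: execute one SQL statement with logging,
        error handling, and simple retry-based recoverability."""
        for attempt in range(1, retries + 1):
            try:
                logger.info("step=%s attempt=%d starting", name, attempt)
                conn.cursor().execute(sql)
                logger.info("step=%s succeeded", name)
                return
            except snowflake.connector.Error as exc:
                logger.warning("step=%s attempt=%d failed: %s", name, attempt, exc)
                if attempt == retries:
                    raise  # surface the failure after the last retry
                time.sleep(backoff_s * attempt)

    # Hypothetical usage: land raw data, then apply business logic in Snowflake.
    conn = snowflake.connector.connect(
        account="my_account", user="etl_user", password="...",  # placeholders
        warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
    )
    run_step(conn, "load_orders",
             "COPY INTO raw.orders FROM @orders_stage FILE_FORMAT = (TYPE = PARQUET)")
    run_step(conn, "transform_orders",
             "INSERT INTO core.orders_clean SELECT * FROM raw.orders WHERE order_id IS NOT NULL")

The same wrapper pattern could standardize error handling, logging, and retry behavior across pipeline steps, whatever orchestrator runs them.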
Required qualifications to be successful in this role
• Bachelor's degree, preferably in a relevant field, is a plus
• Relevant certifications are a plus
• Minimum of 5 years of experience in a data engineering role focusing on data warehouse and associated data pipeline development and support
• 5+ years of proven hands-on experience with ETL/ELT tools, specifically Matillion or dbt
• 2-3+ years of hands-on experience with data warehousing on Snowflake
• Expertise in SQL programming and query optimization, especially related to Snowflake
• Proficient in programming languages, especially SQL and Python
• Skilled in integrating data from disparate sources, both in batch and via APIs, leveraging Python
• Proficiency in real-time streaming technologies such as Kafka or comparable tools (experience with Striim is a plus); a minimal consumer sketch appears after this list
• Familiarity with cloud platforms, specifically AWS, as well as Azure or GCP
• Experience with source control using Git, hosted on GitHub (preferred) or Bitbucket
• Experience with Agile methodologies
• Ability to multi-task effectively without compromising the quality of the work.
• Detail oriented, organized, process focused, problem solver, proactive, ambitious, customer service focused.
• Ability to draw conclusions and make independent decisions with limited information.
• Ability to respond to common inquiries from customers, staff, regulatory agencies, vendors, and other members of the business community.
• Self-motivated, reliable individual capable of working independently as well as part of a team.
• Possess the intellectual curiosity to work through ambiguity with minimal instruction.
• Ability to manage change and thrive in a fast-paced environment under time constraints.
• Ability to take initiative and work effectively with individuals at all levels of the organization.
• Ability to work with mathematical concepts such as probability and statistical inference.
• Ability to effectively and succinctly simplify complex data and communicate information to Senior Management.
• Customer service oriented and proactive in anticipating and resolving problems while maximizing efficient use of computing resources.
• Excellent interpersonal, oral and written communication skills
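Purely as an illustration of the streaming skills referenced above, and not an assessment requirement: a minimal Kafka consumer sketch in Python, assuming the confluent-kafka package and a hypothetical broker address, topic, and consumer group.

    from confluent_kafka import Consumer, KafkaError  # assumption: confluent-kafka installed

    # Hypothetical broker, topic, and consumer group, for illustration only.
    consumer = Consumer({
        "bootstrap.servers": "broker:9092",
        "group.id": "orders-etl",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])

    try:
        while True:
            msg = consumer.poll(timeout=1.0)  # wait up to 1s for a record
            if msg is None:
                continue  # nothing arrived this interval
            if msg.error():
                # End-of-partition events are informational; anything else is real.
                if msg.error().code() != KafkaError._PARTITION_EOF:
                    raise RuntimeError(msg.error())
                continue
            # In a real pipeline this record would be validated and
            # micro-batched into the data lake or warehouse.
            print(msg.key(), msg.value())
    finally:
        consumer.close()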