
Snowflake Technical Architect


ClifyX, INC

Posted: 2024-11-07 07:39:04

Job location: South Plainfield, New Jersey, United States

Job type: Full-time

Job industry: Construction

Job description

Job Title: Snowflake Technical Architect (H1B Sponsorship Available)

Job Location: USA

Job Type: Full-Time; Salary: $100K

Sponsorship: New H1B sponsorship available for eligible candidates

Job Summary:

We are looking for an experienced Snowflake Technical Architect to lead the design and implementation of data warehousing and analytics solutions using Snowflake's cloud data platform. This position offers H1B sponsorship for qualified candidates. The ideal candidate will have extensive experience with Snowflake, data architecture, cloud platforms, and data governance, with a proven ability to lead complex data migration and transformation projects.

Key Responsibilities:

Snowflake Architecture Design: Architect and design scalable data warehousing and analytics solutions using Snowflake.

Data Integration and ETL Design: Lead the design of data integration pipelines, ETL processes, and data transformation workflows with Snowflake, integrating data from various source systems.

Cloud Data Migration: Develop and implement strategies for migrating on-premises or other cloud data systems into Snowflake.

Performance Optimization: Optimize Snowflake solutions for performance, scalability, security, and cost-efficiency.

Security and Governance: Implement best practices for data security, governance, and compliance in Snowflake, including role-based access control, data masking, and auditing.

Collaboration with Data Teams: Work with cross-functional teams, including data engineers, data scientists, and business stakeholders, to gather requirements and translate them into technical solutions.

Automation & Orchestration: Automate data pipeline workflows using orchestration and integration tools such as Airflow, Matillion, or AWS Lambda.

Monitoring and Troubleshooting: Implement monitoring and troubleshooting mechanisms to ensure the ongoing health of the data ecosystem.

Documentation: Document architectural designs, data models, best practices, and standards.

Required Skills & Experience:

Snowflake Expertise: Proven experience in designing and implementing data solutions on Snowflake. Knowledge of Snowflake features such as virtual warehouses, data sharing, multi-cluster architecture, and time travel.

Data Architecture: Strong understanding of data modeling techniques (star schema, snowflake schema), data warehousing concepts, and cloud data platforms.

ETL/ELT Tools: Experience in designing ETL/ELT pipelines using tools like Informatica, Talend, Matillion, Apache Airflow, or dbt.

Cloud Platforms: Familiarity with one or more cloud platforms (AWS, Azure, Google Cloud) and their integration with Snowflake.

SQL Expertise: Advanced SQL skills with experience in optimizing queries for large datasets and implementing best practices in database performance tuning.

Security and Compliance: Understanding of data governance, data privacy, and security protocols (e.g., encryption, data masking, RBAC) in Snowflake.

Programming Skills: Experience with Python or Java for data integration and automation.

Collaboration & Leadership: Strong communication skills and the ability to work with technical and non-technical teams. Proven experience leading large-scale data projects.

DevOps and Automation: Familiarity with Infrastructure as Code (IaC) tools like Terraform or CloudFormation and DevOps pipelines.

Education and Certifications:

Bachelor's degree in Computer Science, Information Technology, or a related field.

Preferred Certifications:

SnowPro Core Certification (or higher)

AWS Certified Solutions Architect, Google Cloud Professional Data Engineer, or Azure Data Engineer Associate

Preferred Qualifications:

5+ years of experience in data architecture, including at least 2 years working with Snowflake.

Hands-on experience with cloud data migration and cloud-native tools.

Experience with streaming data architectures (Kafka, Kinesis, etc.).

Familiarity with Agile/Scrum methodologies and working in DevOps environments.

Strong analytical, troubleshooting, and problem-solving skills.
