Horizontal Integration
Projects the candidate will be working on:
Joint Issue Management is a new endeavor undertaken to provide enterprise-level visibility into all Cases and Case work items that are being logged and handled here.
Joint Issue Management will use Kafka streaming technologies to provide topics that other internal applications will be able to publish to and consume from, creating a centralized routing and integration layer across all case-handling applications.
As data passes through this streaming layer, it will need to be transferred into a Snowflake database, and that is where these data warehouse engineers come into play.
They will be key resources in developing and implementing the technology needed to ingest all data from this streaming intermediary into Snowflake so that critical reports can be produced at an enterprise level.
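To give a flavor of this ingestion work, here is a minimal sketch of one common pattern: the Snowflake Kafka connector lands raw messages as VARIANT rows, and a stream plus a scheduled task flattens them for reporting. All object names (JIM_DB, CASE_EVENTS, JIM_WH, and the caseId/status/updatedAt fields) are hypothetical, and the Kafka connector itself is configured outside Snowflake.

    -- Landing table in the two-column shape the Snowflake Kafka connector writes
    CREATE TABLE IF NOT EXISTS JIM_DB.RAW.CASE_EVENTS (
        RECORD_METADATA VARIANT,
        RECORD_CONTENT  VARIANT
    );

    -- Stream that tracks newly arrived messages
    CREATE STREAM IF NOT EXISTS JIM_DB.RAW.CASE_EVENTS_STRM
        ON TABLE JIM_DB.RAW.CASE_EVENTS;

    -- Task that periodically flattens new JSON events into a reporting table
    -- (the task must still be resumed with ALTER TASK ... RESUME)
    CREATE TASK IF NOT EXISTS JIM_DB.RAW.LOAD_CASE_EVENTS
        WAREHOUSE = JIM_WH
        SCHEDULE  = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('JIM_DB.RAW.CASE_EVENTS_STRM')
    AS
        INSERT INTO JIM_DB.REPORTING.CASES (CASE_ID, STATUS, UPDATED_AT)
        SELECT RECORD_CONTENT:caseId::STRING,
               RECORD_CONTENT:status::STRING,
               RECORD_CONTENT:updatedAt::TIMESTAMP_NTZ
        FROM JIM_DB.RAW.CASE_EVENTS_STRM;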
Functions may include:
Database architecture, engineering, design, optimization, security, and administration, as well as data modeling, big data development, Extract, Transform, and Load (ETL) development, storage engineering, data warehousing, data provisioning, and other similar roles.
Responsibilities may include Platform-as-a-Service and cloud solutions, with a focus on data stores and their associated ecosystems.
Duties may include managing design services, providing sizing and configuration assistance, ensuring strict data quality, and performing needs assessments.
Analyzes current business practices, processes, and procedures, and identifies future business opportunities for leveraging data storage and retrieval system capabilities.
Manages relationships with software and hardware vendors to understand the potential architectural impact of different vendor strategies and data acquisition approaches.
May design schemas, write SQL or other data scripting, and help support the development of analytics and applications that build on top of the data.
Selects, develops and evaluates personnel to ensure the efficient operation of the function.
Basic, structured, standard approach to work.
Undergraduate degree or equivalent experience.
Required experience:
Snowflake-certified developer; deep understanding of cloud technology, Snowflake credits, and related concepts
At least 2 years of hands-on experience with large-scale data warehouses
Experience migrating from Teradata or big data platforms to Snowflake
Experience with Snowflake warehouses and developing applications on Snowflake: SnowSQL, Snowpipe, JavaScript UDFs, and stored procedures (brief sketches follow this list)
Experience creating and consuming data through platforms such as Web APIs, Kafka, and CDC
Experience consuming data from flat files originating in other cloud environments
Proficient in one or more programming languages such as Java, Scala, JavaScript, or Python
Proficient in SQL, including complex queries
Experience in source data analysis, data profiling, and debugging
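To make a couple of the items above concrete, here is a minimal Snowpipe sketch for auto-ingesting flat files that arrive from another cloud environment. The bucket URL, object names, and file layout are hypothetical, and the storage integration and credentials a real stage needs are omitted.

    -- Hypothetical landing table for flat-file loads
    CREATE TABLE IF NOT EXISTS JIM_DB.RAW.CASE_FILE_LANDING (
        CASE_ID    STRING,
        STATUS     STRING,
        UPDATED_AT TIMESTAMP_NTZ
    );

    -- External stage over a (hypothetical) bucket; real stages need a storage integration
    CREATE STAGE IF NOT EXISTS JIM_DB.RAW.CASE_FILES_STG
        URL = 's3://example-bucket/case-exports/'
        FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

    -- Snowpipe that copies each new file into the landing table as it arrives
    CREATE PIPE IF NOT EXISTS JIM_DB.RAW.CASE_FILES_PIPE
        AUTO_INGEST = TRUE
    AS
        COPY INTO JIM_DB.RAW.CASE_FILE_LANDING
        FROM @JIM_DB.RAW.CASE_FILES_STG;

And a minimal JavaScript stored procedure of the kind the Snowflake items refer to; the procedure name, target table, and retention logic are illustrative only.

    CREATE OR REPLACE PROCEDURE JIM_DB.RAW.PURGE_OLD_EVENTS(RETENTION_DAYS FLOAT)
    RETURNS STRING
    LANGUAGE JAVASCRIPT
    AS
    $$
    // Remove landing rows older than the given retention window.
    var stmt = snowflake.createStatement({
        sqlText: "DELETE FROM JIM_DB.RAW.CASE_EVENTS " +
                 "WHERE RECORD_CONTENT:updatedAt::TIMESTAMP_NTZ < " +
                 "DATEADD(day, -1 * ?, CURRENT_TIMESTAMP())",
        binds: [RETENTION_DAYS]
    });
    stmt.execute();
    return "Purged events older than " + RETENTION_DAYS + " days";
    $$;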