RESPONSIBILITIES:
Kforce’s client, a growing technology company in Houston, TX, is seeking a few Remote Senior and Principal Data Engineers. We are working directly with the Hiring Manager on these exclusive search assignments. These positions are 100% Remote.

Summary:
The Senior and Principal Data Engineers will work with a diverse and skillful team consisting of software engineering, product development and management, data engineering, security operations, and systems engineering. They will be part of a team working to create, preserve, and improve well-functioning, well-designed, multi-platform systems and products for both internal and external customers. The Senior and Principal Data Engineers will build and maintain the data pipeline, build performant transformations of data for analysis and reporting, and use modern ELT (extract/load/transform) practices to move data into our Data Lake and Warehouse.

Responsibilities:
- Senior and Principal Data Engineers will develop data pipelines in a consumer and producer pattern to move data in and out of the data lake and warehouse using a variety of technologies, including Apache NiFi, Apache Kafka, Matillion, and Python
Structure Data and Queries for Performance:
- You will partner with data analysts, data scientists, and business intelligence developers to model and sanitize data for algorithm development, reporting, performance, and cost management
Build and Maintain the Data Dictionary:
- Collaborate with other development engineers and use NiFi for data processing: create pipelines in Docker containers, connect to data sources, run quality checks and SQL scripts, and detect integrity issues
- Send validated data to the Postgres database, keeping performance high and handling errors along the data pipeline
- Route data to Kafka and store it
- Work on data pipelines and data modeling, develop code, write SQL queries, and build warehouses
- Automate these processes (a minimal illustrative sketch of this flow follows this list)
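For illustration only, here is a minimal Python sketch of the kind of consumer/producer flow described above: validated records are inserted into Postgres and published to Kafka, while rejected records are routed to an error topic. The connection details, table, and topic names are hypothetical, and the libraries shown (psycopg2 and kafka-python) are an assumption rather than part of the role description.

# Minimal sketch of the pipeline step described above; all names are hypothetical.
import json

import psycopg2                      # Postgres client
from kafka import KafkaProducer      # kafka-python producer


def is_valid(record: dict) -> bool:
    """Basic integrity check: required fields present and amount is numeric."""
    return "id" in record and isinstance(record.get("amount"), (int, float))


def run_pipeline(records, pg_dsn="dbname=analytics user=etl", brokers="localhost:9092"):
    """Insert valid records into Postgres and stream them to Kafka; route bad records to an error topic."""
    producer = KafkaProducer(
        bootstrap_servers=brokers,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    conn = psycopg2.connect(pg_dsn)
    try:
        with conn, conn.cursor() as cur:  # commits on success, rolls back on error
            for record in records:
                if is_valid(record):
                    # "Good" data lands in the warehouse-facing Postgres table.
                    cur.execute(
                        "INSERT INTO staging.events (id, amount) VALUES (%s, %s)",
                        (record["id"], record["amount"]),
                    )
                    # Valid records are also streamed downstream via Kafka.
                    producer.send("events.validated", record)
                else:
                    # Errors are handled along the pipeline rather than dropped silently.
                    producer.send("events.rejected", record)
        producer.flush()
    finally:
        conn.close()


if __name__ == "__main__":
    run_pipeline([{"id": 1, "amount": 9.99}, {"id": 2, "amount": "bad"}])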
REQUIREMENTS:
- BS degree or higher in Engineering, Computer Science, Data Science, or related field, or equivalent industry experience
- 5+ years of experience in the data warehouse space
- 5+ years of experience writing and analyzing SQL statements
- 5+ years of experience in schema design and dimensional data modeling
- 3+ years of experience in ELT/ETL design, implementation, testing, and maintenance
- Experience with ETL, SQL and Snowflake
- Experience with large-scale cloud-based data warehouses like Snowflake, Google’s BigQuery, or AWS Redshift
- Experience with several mid-size (terabytes, millions of transactions/day) data processing projects
- Schema design and dimensional data modeling
- Strong experience with ETL tools such as NiFi or another ETL tool (e.g., SSIS, BI tools)
- Strong experience with data pipelines is required
- Writing SQL code
- Scripting skills in Python or a similar language for automation
- Any on-prem data warehouse is fine; cloud is not required. This could be Snowflake, Google’s BigQuery, AWS Redshift, on-prem Teradata, a SQL Server warehouse (or Azure), etc.
- The team is currently using AWS
- Experience managing real-time and streaming data producers
- Experience leveraging APIs, databases, and streams as data producers
- Very good communication skills
Kforce is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status.