Company: Costco
Job Title: Data Engineer
Education Required: Graduate
Job Hours: 8
Pay: $20-$30/hour
Location: Chicago
Job Details:
Costco IT supports the technical operations of Costco, the third-largest retailer in the world, with wholesale operations in fourteen countries. Despite our size and rapid international growth, we continue to foster a family-oriented environment where employees can thrive. As proof, Forbes ranks Costco as the seventh-best employer in the world.
The secret to Costco's success is its culture, which creates a workplace unlike any other in the high-tech sector. The value Costco places on its employees has been covered extensively by publications including Bloomberg and Forbes. Our employees and teams come first, and the company has won numerous awards for these qualities. Costco is also well known for its generosity and dedication to the local community, and it encourages staff to volunteer by sponsoring a variety of charitable-giving opportunities.
Come join the Costco Wholesale IT group. The Costco IT department is undertaking exciting transformational projects in a fast-paced, dynamic environment. We are building a state-of-the-art retail environment where you will be surrounded by dedicated, expert staff.
The Data Engineer - Costco Logistics BI is responsible for creating the end-to-end data pipelines that power Costco Logistics reporting. The primary focus of this position is building and delivering automated data pipelines from numerous internal and external data sources. Working with product owners, engineering, and data platform teams, the Data Engineer will plan, build, test, and automate data pipelines that serve as the single source of truth for the entire organization.
If you want to be part of one of the BEST companies to work for in the world, simply apply and let your career be reimagined.
ROLE.
- Makes data available for use by building and operationalizing data pipelines (Costco Logistics BI).
- Creates data pipelines in collaboration with data architects and data/BI engineers and offers ongoing advice for improving data storage, ingestion, quality control, and orchestration.
- Designs, develops, and implements ETL/ELT processes using IICS (Informatica Intelligent Cloud Services).
- Uses MySQL to deliver our data products and services more quickly and effectively.
- Improves and accelerates delivery of our data products and services using Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, and Azure Data Factory.
- Creates scalable data processing platforms, implementing big data and NoSQL solutions to provide the organization with high-value insights.
- Identifies, designs, and implements internal process improvements, such as automating manual procedures and improving data delivery.
- Identifies opportunities to improve the quality, efficiency, and consistency of data management.
- Communicates technical concepts effectively to non-technical audiences, both verbally and in writing.
- Performs peer reviews of other data engineers' work.
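The pipeline-building work described above follows a standard extract-transform-load pattern. As a minimal sketch only: the source data, file format, table names, and transformation here are all hypothetical, and a production pipeline at this scale would run on IICS or Azure Data Factory rather than plain Python.

```python
# Minimal extract-transform-load sketch with made-up logistics data.
# A real pipeline would use IICS or Azure Data Factory, not hand-rolled Python.
import csv
import io
import sqlite3

# Hypothetical raw feed from an upstream logistics source.
RAW_CSV = """order_id,warehouse,units
1001,Chicago,12
1002,Dallas,7
1003,Chicago,3
"""

def extract(source: str) -> list[dict]:
    """Extract: read rows from a CSV source."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and keep only the fields reporting needs."""
    return [(int(r["order_id"]), r["warehouse"], int(r["units"])) for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (id INTEGER, warehouse TEXT, units INTEGER)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(units) FROM orders").fetchone()[0]
print(total)  # 22
```

Keeping extract, transform, and load as separate steps is what makes a pipeline like this testable and automatable, which is the core of the role described above.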
REQUIRED.
- 3+ years' experience designing and implementing data pipelines for large, complex datasets.
- 2+ years' experience with Informatica PowerCenter.
- 2+ years' experience with Informatica IICS.
- 2+ years' working knowledge of big data and cloud technologies such as ADLS, Azure Databricks, Spark, Azure Synapse, and Cosmos DB.
- 2+ years' experience with ETL, data warehousing, and data modeling.
- 2+ years' experience integrating data using ETL, Kafka, and other event- and message-based techniques.
- 2+ years' experience using Git and Azure DevOps.
- Extensive experience working with various data sources, such as XML, SQL, flat files (CSV, delimited), and SQL Server databases.
- Excellent SQL skills, including familiarity with relational databases and business data and the ability to write complex SQL queries against a variety of data sources.
- Solid grasp of data warehousing, graph databases, relational databases, and NoSQL databases.
- Able to work in a fast-paced agile development environment.
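As an illustration of the SQL skill the requirements above describe (a join plus aggregation over multiple tables), here is a small sketch. The schema, table names, and data are invented for this example and do not reflect any real Costco system.

```python
# Hypothetical warehouse/shipment schema; names and data are invented
# to illustrate a join + aggregation query of the kind the listing asks for.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE shipments (id INTEGER, warehouse TEXT, cost REAL);
CREATE TABLE warehouses (name TEXT, region TEXT);
INSERT INTO shipments VALUES (1, 'Chicago', 120.0), (2, 'Chicago', 80.0), (3, 'Dallas', 50.0);
INSERT INTO warehouses VALUES ('Chicago', 'Midwest'), ('Dallas', 'South');
""")

# Average shipment cost per region, highest first.
rows = conn.execute("""
    SELECT w.region, AVG(s.cost) AS avg_cost
    FROM shipments s
    JOIN warehouses w ON w.name = s.warehouse
    GROUP BY w.region
    ORDER BY avg_cost DESC
""").fetchall()
print(rows)  # [('Midwest', 100.0), ('South', 50.0)]
```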
RECOMMENDED.
- Microsoft Azure and related technology certifications.
- Understanding of agile software development methodologies for delivering data solutions.
- Proficiency with scripting languages such as Python or PowerShell.
- Familiarity with the UC4 Job Scheduler.
- Familiarity with the retail industry.
- Effective written and verbal communication skills.
- BA/BS in Computer Science, Engineering, or a related field, plus relevant software or services experience.
REQUIRED DOCUMENTS.
- Cover letter.
- Resume/CV.
Please click here to read the Costco Applicant Privacy Notice if you are a California applicant.