Job Description
A Bank in Downtown Toronto (WFH) is seeking a Data Engineer - 4 for a contract ending October 31st, 2023.
• Business Group: Global Retail Data
• Project: Executing the data governance program at the Bank by cataloging data and working with partners to build reusable data sets
Candidate Value Proposition
• The successful candidate will have the opportunity to join one of the top banks in Canada
Typical Day in Role
• Utilize Spanish-language abilities to support teams across International Banking in the development of metadata, data lineage, data quality monitoring, data pipelines, crosswalk tables for standardizing data, data sharing agreements, and service level agreements.
• Champion a customer-focused culture to deepen client relationships and leverage broader Bank relationships, systems, and knowledge across International Banking.
• Conduct ETL, SQL, and DB performance tuning, troubleshooting, support, and capacity estimation to ensure the highest data quality standards, working with business and technical SMEs.
• Help profile data and ETL logic to document end-to-end data flow and lineage from capture at source, through storage, to delivery, including its business intent at each stage (i.e., capture, transformation, fragmentation, editing).
• Profile and analyze source data to determine the best reporting structures to build.
• Design and develop ETL pipelines using multiple sources of data in various formats according to business requirements.
• Conduct dimensional modelling, metadata management, data wrangling, data cleaning and conforming, and warehouse querying.
Job Requirements
• 2-5+ years of experience working as a Data Engineer or Technical Business Analyst.
• Required to be proficient in Spanish
• Strong proficiency, including coding experience, in SQL, Python, and R
• Previous experience with data modelling and warehousing
• Proficiency with relational databases (Oracle, DB2, Redshift, etc.) & proficiency with JSON, Hadoop and Hive data structures
Nice to Have Skills
• Experience with Big Data and cloud technologies (Hadoop, MinIO, Presto/Trino, JupyterHub, Airflow)
• API development experience
• Experience working in an Agile team environment
• Proficiency with Google Cloud Platform (GCP)
Soft Skills
• Strong communication skills, both oral and written
• Self-starter
• Team player
• Works well independently
Education
• Bachelor’s Degree in Computer Science, Engineering, Science, or Business, or related work experience
• GCP is a plus
Interview:
• 15-minute video conference with the HM
• 1-hour panel video conference interview with the HM and a Data Engineer (technical and behavioural questions)
Best vs Average
• The best candidate has all the must-have skills plus GCP experience, strong oral and written communication skills, and exposure to a variety of IT/Tech projects.