Glenn Partners Staffing Solution




Job ID: 1860
Job Title: Data Engineer
Permanent/Contract: Contract
Travel (%): 0
Date Posted: 11/12/2019
City: Toronto
State/Province: ON
Country: Canada
Job Description

A bank in Downtown Toronto is seeking a Data Engineer for a 6-month contract. The DevOps/Data Engineer is responsible for supporting the design, development, and delivery of Business Intelligence projects supporting enterprise-wide initiatives utilizing Enterprise Data Lake (EDL) and Hadoop technologies. Business Intelligence solutions include providing ad-hoc analytics, data discovery, data exploration, and reporting used to generate insights and enable the Bank's decision makers to make informed decisions. The incumbent is also responsible for preparing technical designs for all solutions under his/her responsibility and ensuring that adequate and appropriate controls are implemented within those designs to guarantee the integrity and accuracy of the information provided to clients.

• Project: This project will automate a number of data collection and analytical processes at the Bank. The current-state process analyzes stock trading activities performed by the Bank's equity traders. The business executes it manually each day: trading data is downloaded from the internal EDL (Enterprise Data Lake) streaming architecture using Jupyter Notebooks / Python scripts and checked for data integrity, completeness, etc. Once the data set is validated, it is incorporated into a Tableau workbook for data visualization and published to a Tableau server for end users to review. The goal of the project is to automate these types of processes.
• The successful candidate will join a dynamic team environment that will be utilizing new technology to automate and streamline processes efficiently.
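The manual daily routine described above (download the trading extract, check integrity and completeness, then publish) is the kind of step this role would automate. As an illustrative sketch only — the posting gives no schema, so the field names and the `validate_extract` helper below are hypothetical — the validation step might look like:

```python
# Hypothetical field set for a daily trading extract; the real EDL
# schema is not given in the posting.
REQUIRED_FIELDS = {"trade_id", "trader", "symbol", "quantity", "price"}

def validate_extract(rows):
    """Run the integrity/completeness checks the posting describes being
    done by hand: non-empty extract, required fields present, no missing
    values. Returns a list of problems (empty list == valid)."""
    problems = []
    if not rows:
        problems.append("extract is empty")
        return problems
    for i, row in enumerate(rows):
        missing = REQUIRED_FIELDS - row.keys()
        if missing:
            problems.append(f"row {i}: missing fields {sorted(missing)}")
        for key in REQUIRED_FIELDS & row.keys():
            if row[key] in (None, ""):
                problems.append(f"row {i}: empty value for {key!r}")
    return problems

# Example: one clean row, one row with a missing value.
rows = [
    {"trade_id": 1, "trader": "T1", "symbol": "BNS", "quantity": 100, "price": 71.5},
    {"trade_id": 2, "trader": "T2", "symbol": "TD", "quantity": None, "price": 80.1},
]
print(validate_extract(rows))
```

An automated pipeline would run checks like these on a schedule (e.g. from a Jenkins job) and fail loudly instead of relying on a person eyeballing a notebook each morning.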
Typical Day in Role
• Implement DevOps and automate data analytical processes utilizing EDL, Tableau Desktop/Server, Jupyter Notebooks, Jenkins pipelines, Datameer, Hadoop/Spark, and Bitbucket;
• Promote the use of data and information by ensuring that all stakeholders' needs and requirements are actioned and implemented according to design; service all requests and set reasonable expectations for turnaround times at all levels of priority;
• Design, develop, implement, and maintain new reporting functionality and analytic applications to support multiple business units, utilizing analytical tools such as Tableau, Power BI, or Datameer to extract data from the Enterprise Data Lake;
• Conduct research and execute proofs of concept on various Business Intelligence products, services, and standards utilizing Big Data technologies;
• Prototype and develop models in some or all of the following: SQL, Hadoop, Python, Spark, R, or similar technologies;
• Facilitate the implementation of solutions and strategies for Business Intelligence projects based on Big Data, Hadoop, and the Enterprise Data Lake;
• Design and develop solutions to convert traditional data warehousing applications into robust Big Data implementations utilizing Hadoop;
• Experience developing and supporting data solutions on the SQL Server data platform, including SSIS and ETL, would be beneficial;
• Work with other team members and end users within Agile or Waterfall project management frameworks to create unique solutions that satisfy clients' needs;
• Support a work environment that fosters high employee satisfaction and engagement by sharing knowledge, collaborating, communicating, and participating in team-oriented activities;
• Implement, maintain, and educate on new tools where applicable to enable end users to enhance their output by properly utilizing these tools;
• Identify data mining opportunities to facilitate new business insights;
• Develop executive dashboards utilizing various data visualization tools and techniques;
• Handle multiple projects simultaneously while maintaining accuracy and timeliness;
• Develop technical designs for new/enhanced reports and information extracts, ensuring that designs are consistent with clients' requirements;
• Constantly maintain and upgrade technical skills and business knowledge.
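Several of the duties above amount to feeding validated data into dashboards on a schedule rather than by hand. A minimal, hypothetical sketch of that publish step — writing a dated extract file that a Tableau data source or a downstream Jenkins stage could pick up; the `publish_extract` helper, file naming, and directory layout are assumptions, not from the posting:

```python
import csv
import tempfile
from datetime import date
from pathlib import Path

def publish_extract(rows, out_dir):
    """Write a validated extract to a dated CSV under out_dir and return
    its path. A dashboard data source or a scheduled job could refresh
    from this file; the layout here is illustrative only."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    out_path = out_dir / f"trading_extract_{date.today():%Y%m%d}.csv"
    with out_path.open("w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=sorted(rows[0]))
        writer.writeheader()
        writer.writerows(rows)
    return out_path

# Example: publish a one-row extract to a temporary directory.
sample = [{"trade_id": 1, "symbol": "BNS", "quantity": 100, "price": 71.5}]
target = publish_extract(sample, tempfile.mkdtemp())
print(target.name)
```

In practice the scheduling would live in a CI tool such as Jenkins, with the Tableau workbook pointed at the published file or an equivalent server-side data source.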
Job Requirements

Must-Have Skills:
1) 3+ years' experience automating processes and updating, loading, and feeding dashboards (Tableau or similar)
2) 3+ years' configuring applications with Jenkins and Hadoop
3) 3+ years' analytical programming with Python, Korn, Shell, Bash, or Tableau
4) 3+ years' experience working on continuous integration processes
5) Excellent analytical and problem-solving skills

Note: the current process is done manually; the Bank is looking for someone to automate the program and its feeds.

- Ability to work independently and interact effectively as part of a team;
- Excellent communication, interpersonal, and negotiating skills;
- Enthusiasm for making a positive impact at the Bank.

Nice-to-Have Skills:
- Experience with Google Cloud Platform / Diyotta would be a plus (future road map);
- Working experience with multidimensional databases (SQL Server) and reporting and analysis tools, including data files from the data lake;
- FI (financial industry) experience.

Degrees or certifications:
• Bachelor's degree in a technical field such as computer science, computer engineering, or a related field (required)
     
Click here to upload your resume, and remember to list the Job ID(s).
Copyright © 2008 Glenn Partners Staffing Solution Inc. postmaster@glennpartners.com