
Data Engineer (SQL, Python, Google Cloud, ETL, Big Query)

Teamrecruiter.com Inc

This is a Contract position in Toronto, ON posted November 17, 2020.

“The start date of this job is flexible, and in order to find the best candidate the job may remain open much longer than the date mentioned on this posting…or you may be asked to start sooner than the expected start date if you are able to do so.” URGENT: Please read the job description below.

If this interests you, please send an MS Word copy of your updated resume (ASAP) along with your salary expectations, first available date, and a telephone contact number. Please mention the job title above in the subject line. The recruiter in charge of this role is Dhruv. If you do not think you are a match for this opportunity but know someone who is, feel free to forward this email to them; you will be eligible for a referral bonus upon a successful hire.

One of our telecommunication clients is looking for a Data Engineer (SQL, Python, Google Cloud, ETL, BigQuery). Length: contract term is 1 year with a possibility of extension. Location: Toronto, ON.

ROLE: As a Data Engineer, you will be responsible for crafting, building and running the data-driven applications that enable innovative, customer-centric digital experiences for our customers.

You will be working on a Digital Transformation for Data project with the Data and Analytics team. Candidates must have a high level of experience with SQL and ideally Python; there is a large focus on Google Cloud Platform (GCP) for data analytics.

We are looking for someone with a strong understanding of data who can take ownership of their work and communicate with stakeholders on the project.

Needs to have an understanding of, and ideally experience with, data tools (BigQuery, Dataflow, Airflow, and NiFi) and big data platforms (AWS, Azure, Hadoop), plus a basic understanding of the concepts and workings of data warehousing, data lakes, OLAP, and OLTP applications.

Ideally this person will have an understanding and working knowledge of data visualization (Data Studio, Tableau, Domo, etc.) and ETL flows.

SUMMARY: You will be working as part of a friendly, cross-discipline agile team that helps each other solve problems across all functions.

As a custodian of customer trust, you will employ best practice in development, security, accessibility and design to achieve the highest quality of service for our customers.

Our development team uses a range of technologies to get the job done: GCP, BigQuery, Airflow, Dataflow, Composer and NiFi provide a modern, easy-to-use data pipeline.

You will be part of the team building a data pipeline that transfers data from our enterprise data lake to enable data analytics and AI use cases.
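The pipeline described above follows a classic extract-transform-load shape. As a rough illustration only (the real stack uses BigQuery, Dataflow and Airflow; every table, field and function name below is hypothetical), an ETL step can be sketched in plain Python:

```python
# Minimal sketch of the extract -> transform -> load stages of such a
# pipeline, using only the standard library. All record/field names are
# hypothetical; a real implementation would use the GCP client libraries
# (e.g. google-cloud-bigquery) and an orchestrator such as Airflow.
import json
from datetime import datetime, timezone

def extract(raw_lines):
    """Extract: parse raw JSON lines exported from the data lake."""
    return [json.loads(line) for line in raw_lines]

def transform(records):
    """Transform: keep only valid rows and stamp them with a load time."""
    loaded_at = datetime.now(timezone.utc).isoformat()
    return [
        {"customer_id": r["customer_id"], "event": r["event"], "loaded_at": loaded_at}
        for r in records
        if "customer_id" in r and "event" in r
    ]

def load(rows, sink):
    """Load: append rows to a sink (stand-in for a BigQuery insert)."""
    sink.extend(rows)
    return len(rows)

# Example run against in-memory data: the second record lacks a
# customer_id, so the transform stage drops it.
raw = ['{"customer_id": 1, "event": "login"}', '{"event": "orphan"}']
sink = []
count = load(transform(extract(raw)), sink)
print(count)  # 1 valid row loaded
```

In a production pipeline each stage would typically be its own Airflow task so that failures can be retried independently; the sketch only shows the data flow between the stages.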

You are a fast-learning, highly technical, hardworking data engineer looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.

RESPONSIBILITIES:
– Learn new skills & advance your data development practice
– Design, develop, test, deploy, maintain and improve the analytics pipeline
– Assist in evaluating technology choices and rapidly test solutions
– Assist the outcome teams in understanding how to best measure their web properties
– Collaborate closely with multiple teams in an agile environment

MUST HAVES:
– SQL: 4-5 years
– Programming experience: 4 years (Python strongly preferred)
– Experience with Google Cloud Platform (for Data Analytics)
– Experience with Digital Transformations
– Experience with any of the following: BigQuery, Dataflow, Airflow, and NiFi
– A passion for data and analytics
– Experience and proficiency in SQL and at least one other modern programming language (Python, Java, etc.)
– Interest and ability to learn new languages & technologies as needed
– Familiar with the execution sequence of ETL Flows
– Experience with GCP, BigQuery, Airflow, Dataflow, Composer and NiFi
– Basic understanding of data warehouse, data lake, OLAP and OLTP applications

NICE TO HAVES:
– BigQuery experience
– Experience with ETL flows
– Experience with AWS, Azure, Hadoop
– Data visualization experience (Data Studio, Tableau, Domo, etc.)

Please note that this is the most up-to-date version of the job description available at this time. During the client interview you will receive additional information; variance may apply.

Please visit the TEAMRECRUITER website to review other CAREER OPPORTUNITIES.