
Azure BI Consultant (Azure Storage, Data Lake, SQL DB, Python, Data Factory)

This is a Contract position in St Catharines, ON posted November 17, 2020.

“The start date of this job is flexible. In order to find the best candidate, the posting may remain open longer than the date shown above, or you may be asked to start sooner than the expected start date if you are able to do so.” URGENT: Please read the job description below.

If this interests you, please send an MS Word copy of your updated resume as soon as possible, along with your salary expectations, first available start date, and a telephone contact number. Please mention the job title above in the subject line. The recruiter in charge of this role is Dhruv. If you do not think you are a match for this opportunity but know someone who is, feel free to forward this email to them; you will be eligible for a referral bonus upon a successful hire.

One of our government clients is looking for an Azure BI Consultant (Azure Storage, Data Lake, SQL DB, Python, Data Factory).

Length: Contract term is 8 months with a possibility of extension.

Location: St. Catharines, ON

Role: The Ontario Government requires a resource to join their team and assist with a data set solution, moving a data pipeline from an operational database to a self-serve reporting environment.

They will join a team that is currently doing similar work and follow the architecture in place.

Responsibilities/Assignment Deliverables: The consultant will design, implement, and perform required knowledge transfer of the following deliverables:
Data Ingestion:
– Design and implementation of the raw data storage and ingestion mechanism
– Build data pipelines to ingest raw transactional data from the source system

Data Pipelines and Semantic Model:
– To be performed in logical iterations
– Data pipelines and the semantic model will be based on set requirements
– Augment data pipelines to transform and move raw data into the semantic model
– Semantic modelling will determine how data structures will be made available, combined, processed, pre-calculated, and stored

Knowledge Transfer:
– Knowledge transfer sessions and documentation for technical staff related to designing and implementing the above-stated end-to-end analytics solution

Must Haves:
– Demonstrated experience with Azure Storage, Azure Data Lake, Azure SQL DB, Azure Synapse, and Azure Analysis Services structures
– Demonstrated experience automating data pipelines using appropriate Microsoft Azure platform technologies (Python, Databricks, and Azure Data Factory)
– Demonstrated experience with Power BI reports and dashboards a strong asset
– Demonstrated experience conducting knowledge transfer sessions and building documentation for technical staff related to designing and implementing end-to-end analytics solutions
– Resource needs to be able to work collaboratively in a team environment and maintain a positive attitude

Please note that this is the most up-to-date version of the job description available at this time. You will receive additional information during the client interview; some variance may apply.

Please visit the TEAMRECRUITER website to review other career opportunities.