

Big Data Platform Engineering Specialist


This is a Contract position in Waterloo, ON posted July 30, 2022.

Job Description:

The successful candidate, as part of the Big Data Infrastructure Platform Engineering team, will work closely with a team of talented architects and engineers to deliver scalable, highly available, and cost-efficient Big Data platform solutions, integrated into the corporate and BlackBerry services design, in both the lab and in production.

The following are the main day-to-day responsibilities of the role:

  • Qualify and drive new solutions and related architectural initiatives to enrich the Big Data platform service catalogue.
  • Prototype and design the multiple layers of the Big Data platform in the lab, including hardware, firmware, OS, Hadoop, Kafka, integrated storage, HA, and data replication.
  • Develop and test deployment architectures, with varying scale, and document them for consistent use and efficient configuration management.
  • Conduct failure mode analysis, develop and test recovery procedures, and invoke failover and DR plans when required.
  • Deploy, upgrade and patch the Big Data systems in production and verify the platform health upon changes.
  • Define and maintain the Big Data platform's key performance indicators, system health thresholds, and known-issues list to enable the NOC to perform monitoring and incident management.
  • Troubleshoot Hadoop/Kafka problems and drive short- and long-term corrective actions and platform improvement plans.
  • Automate deployment procedures and platform templates to increase maintenance and operational efficiency.
  • Work closely with the services team and the Product development team to tune and scale the Big Data platform according to growth trends and projection plans.
  • Optimize the Big Data platform performance and system resources allocation to satisfy the application requirements and improve the cost model.
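To illustrate the kind of health-threshold work described above (defining KPIs and thresholds so a NOC can monitor the platform), here is a minimal, hypothetical sketch; the metric names and threshold values are illustrative assumptions, not taken from the posting:

```python
# Hypothetical threshold-based health check of the sort a platform team
# might hand off to a NOC. Metric names and limits are illustrative only.

THRESHOLDS = {
    "hdfs_capacity_used_pct": 85.0,          # alert when HDFS is >85% full
    "kafka_under_replicated_partitions": 0,  # any under-replication alerts
    "yarn_pending_containers": 500,          # scheduling backlog threshold
}

def evaluate_health(metrics: dict) -> list:
    """Return (metric, value, limit) tuples for every breached threshold."""
    alerts = []
    for name, limit in THRESHOLDS.items():
        value = metrics.get(name)
        if value is not None and value > limit:
            alerts.append((name, value, limit))
    return alerts

# Example snapshot of collected metrics
sample = {
    "hdfs_capacity_used_pct": 91.2,
    "kafka_under_replicated_partitions": 0,
    "yarn_pending_containers": 120,
}
print(evaluate_health(sample))  # only the HDFS capacity limit is breached
```

In practice the metrics would come from a monitoring source such as Ambari or JMX exporters, and breaches would feed an alerting pipeline rather than a print statement.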

Essential Skills / Qualifications

  • Hadoop (HDP 2.6.5 and newer)
  • Kafka
  • Yarn
  • Oozie
  • ZooKeeper
  • Hive
  • Solr
  • Ambari Views
  • Apache Ranger
  • HortonWorks Data Flow
  • Public Clouds
  • Scripting and automation

The minimum requirements for this position are 7 years of technical experience in Big Data design and support and a Bachelor's degree in Computer Science/Engineering or equivalent. Strong interpersonal skills are also critical to success in this role.

