Python Data Scientist - Research and Engineering
Skynet Software
Kitchener, ON

BACKGROUND

Max Insurance is a 160-year-old Property & Casualty insurance company that provides Home, Farm, Commercial, and Auto insurance. The company was acquired by Chelsea Avondale in 2016 with a plan to technologically modernize its operations and transform its pricing with advanced scientific methodologies.

Chelsea Avondale’s team consists of former Canadian bank executives and industry veterans who have a background in scientific computing and the development of full-scale, production-level systems.

Skynet Software is a sister company to Max Insurance and Chelsea Avondale that employs the scientists, research and engineering staff, and systems developers who serve the overall group.

The objective for Max Insurance since being acquired by Chelsea Avondale has been to “wipe clean” all of the company’s systems and “start fresh” in developing the Core Systems of an insurance company. This is a significant undertaking and a luxury most existing insurance companies don't have. The group is well underway in this process and is looking to add brilliant physical scientists, mathematical scientists, data scientists, and computer scientists to its team.

In practice, this complete redevelopment means re-engineering almost everything: every function of, and every interaction with, the core systems of an insurance company. The Skynet Software team has been established to handle this responsibility.

JOB DESCRIPTION AND REQUIREMENTS

  • We are a Python Shop. You must know Python cold, not as a recreational tool you have merely experimented with. In addition, you must know Pandas extraordinarily well: working with datasets terabytes in size, developing customized functions, knowing how to speed up computation, and so on.
  • The role involves much more than just data science. The development of our risk analytics ecosystem involves scientific modelling of natural hazards, curating risk data warehouses, systems engineering, data engineering, the development of DL/ML/AI middleware, and the guidance and oversight of front-end developers. Less than 50% of this role involves data science. In particular, you must have a knack for building stochastic models for catastrophic events such as wildfires, windstorm and climate events, and floods (a minimal sketch follows this list).
  • You are multi-talented, with deep intellectual curiosity. We are looking for scientific developers with a desire to immerse themselves in software engineering, data science, and mathematical modelling alike. The ideal candidate works quickly, independently, and meticulously. You will have to think creatively to find the best solution for each problem in limited time.
  • We are a NoSQL Shop. You must understand MongoDB or be able to learn it at an expert level quickly.
  • TensorFlow/Keras is a major asset. You have dealt with large numeric datasets and can build fast analytics tools on the GPU. 10 million rows by 300 columns is a no-brainer for you.
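
To make the stochastic-modelling expectation concrete, here is a minimal frequency-severity Monte Carlo sketch of the kind of catastrophe loss model this work involves. The distributions, parameters, and function names are illustrative assumptions, not our production model:

    import numpy as np
    import pandas as pd

    def simulate_annual_losses(n_years=100_000, event_rate=0.8,
                               severity_mu=13.0, severity_sigma=1.5,
                               seed=42):
        """Frequency-severity Monte Carlo: Poisson event counts per year,
        lognormal per-event losses. All parameters are illustrative."""
        rng = np.random.default_rng(seed)
        counts = rng.poisson(event_rate, size=n_years)   # events per simulated year
        losses = rng.lognormal(severity_mu, severity_sigma,
                               size=counts.sum())        # one draw per event
        year_index = np.repeat(np.arange(n_years), counts)  # map events to years
        annual = pd.Series(losses).groupby(year_index).sum()
        # Years with no events contribute zero loss.
        return annual.reindex(range(n_years), fill_value=0.0)

    annual_losses = simulate_annual_losses()
    print("Mean annual loss:", annual_losses.mean())
    print("99.5% VaR:", annual_losses.quantile(0.995))   # tail metric used in cat pricing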

WHAT YOU WILL LEARN FROM US:

  • The insurance business: Policies/Coverages/Claims, risk analysis, advanced modelling, and all the core systems needed to run an insurance company.
  • Risk modelling: Catastrophes, probabilistic risk analysis, application of ML/DL to the insurance world, including the redevelopment of most Actuarial scientific methods.
  • Software Engineering: The design, development (in Python), and deployment of a full scientific software project to run an entire insurance company. You will be involved in combining a backend, middleware, and front-end rooted in risk modelling with ML/DL.
  • Become a World-Class Talent: You will work with Actuaries, Catastrophe Modelling experts, ML/DL specialists, Software/Systems Engineers, and many leaders in the business. Depending on your skills, aptitude, and leadership capabilities, you will build a team or advance to executive roles.

Further information on our Ecosystem Stack:

BACKEND

  • MongoDB for NoSQL backend.
  • ELT pipelines used for aggregations/layering/transformations (via map/reduce, etc.) to create Risk Data Warehouses as well as transactional data structures (“Data Warehouses” or “DWs”).
  • DWs are stored in PyTables HDF5 files for high-speed access, with an eventual move to Hadoop-based file structures. An understanding of the drag of dealing with large datasets is essential (a minimal sketch of this flow follows this list).
  • Numerous APIs for external data sourcing, as well as static data ingestion through geospatial KML files, SQL databases, and JSON records.
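
As a rough illustration of the backend flow above (the collection, field names, and paths are hypothetical, not our actual schema), an ELT step might aggregate policy records out of MongoDB and land the result in a PyTables-backed HDF5 file via pandas:

    import pandas as pd
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")   # hypothetical connection
    db = client["risk"]                                 # hypothetical database

    # Aggregation pipeline: total insured value and policy count per postal code.
    pipeline = [
        {"$match": {"status": "active"}},
        {"$group": {"_id": "$postal_code",
                    "total_tiv": {"$sum": "$insured_value"},
                    "policies": {"$sum": 1}}},
    ]
    rows = list(db.policies.aggregate(pipeline))
    dw = pd.DataFrame(rows).rename(columns={"_id": "postal_code"})

    # Land the warehouse slice in HDF5; pandas delegates to PyTables here.
    # format="table" keeps it appendable and queryable on disk.
    dw.to_hdf("risk_dw.h5", key="tiv_by_postal_code",
              format="table", data_columns=["postal_code"])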

MIDDLEWARE

  • Python Pandas/Dask is our primary framework for almost all engineering.
  • Use of “internal APIs” to create ecosystem modules and multiprocessing via PyZMQ (a minimal sketch follows this list).
  • TensorFlow/Keras for GPU processing, and for all DL/ML/AI classification and clustering work.
  • Data visualization in matplotlib. Very advanced knowledge of customization is helpful (i.e. you could build your own Seaborn).
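
To give a flavour of the “internal API” pattern, here is a minimal PyZMQ request-reply sketch between two processes. The endpoint, message schema, and scoring logic are illustrative assumptions:

    import json
    import multiprocessing as mp
    import zmq

    def risk_scoring_service(endpoint="tcp://127.0.0.1:5555"):
        """A tiny ecosystem module: answers one scoring request and exits."""
        ctx = zmq.Context()
        sock = ctx.socket(zmq.REP)
        sock.bind(endpoint)
        request = json.loads(sock.recv())
        # Placeholder logic; a real module would call a model here.
        sock.send_string(json.dumps({"policy_id": request["policy_id"],
                                     "score": 0.42}))
        sock.close()
        ctx.term()

    if __name__ == "__main__":
        server = mp.Process(target=risk_scoring_service)
        server.start()

        ctx = zmq.Context()
        sock = ctx.socket(zmq.REQ)
        sock.connect("tcp://127.0.0.1:5555")
        sock.send_string(json.dumps({"policy_id": "P-1001"}))
        print(json.loads(sock.recv()))   # {'policy_id': 'P-1001', 'score': 0.42}
        sock.close()
        ctx.term()
        server.join()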

FRONT-END

  • A Django web service at full scale, API-centric so that front-ends stay naive and most logic is kept out of them (a minimal sketch follows this list).
  • We will hire a number of front-end engineers to build lightweight UIs in HTML/CSS/JS, mobile apps, and APIs.
  • Chatbots/NLP will be built to handle transaction ingestion, further minimizing the logic in front-ends.
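
As a sketch of that API-centric split (the endpoint name and pricing stub are hypothetical, not our actual service), a quoting endpoint would return plain JSON so the front-end only renders:

    # views.py in a hypothetical quoting app of the Django service.
    from django.http import JsonResponse

    def compute_premium(policy_id):
        """Stub standing in for the middleware pricing call (hypothetical)."""
        return 1234.56

    def quote(request, policy_id):
        # All pricing logic stays behind the API; front-ends only render JSON.
        return JsonResponse({"policy_id": policy_id,
                             "premium": compute_premium(policy_id)})

    # urls.py
    from django.urls import path
    from . import views

    urlpatterns = [
        path("api/quote/<str:policy_id>/", views.quote),
    ]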

Educational Background Requirements

  • MSc/PhD required.

For More Information, Please Watch This Video:

  • http://maxinsurance.ca/price.html

How to Apply

We ask you to do three things to make it easier for us to consider you for this role:

  • Email your resume (no need for a cover letter) to recruiting @ chelseaavondale.com
  • Email your resume (no need for a cover letter) to gajan @ chelseaavondale.com
  • Email your resume (no need for a cover letter) to jan @ skynetsoftware.ca

Make the subject line of your email “Python Data Scientist - Research and Engineering”.

Job Type: Full-time

Experience:

  • Python: 1 year (Required)

Education:

  • Master's Degree (Required)