Data Scientist Co-op
At HFI, we are relentless advocates for individuals, their families, and our communities. We help low-income and disabled people gain access to the Social Security disability benefits, income, and services they need to improve their quality of life for many years to come. It’s our calling and our life’s work, and it’s a privilege and an honor to do what we do. If you are somebody who wants to transform the lives of the people around you and are passionate about helping others in need, then we would love to have you as part of our team.
Local candidates only; this is NOT a permanent remote position.
The Data Scientist Co-op is responsible for processing structured and unstructured data, analyzing large amounts of raw information to find patterns, and building data products that extract valuable business insights to help improve the company. In this role, the Data Scientist should be highly analytical, with a knack for data analysis, computer programming in Python, Data Science, databases, and Big Data. The individual should also demonstrate a passion for Machine Learning and Research. Responsibilities include, but are not limited to:
ESSENTIAL FUNCTIONS & RESPONSIBILITIES:
Collect data from various sources, ranging from web APIs to internal SQL databases.
Identify and analyze patterns in the large volumes of data supporting HFI’s new business initiatives, and detect sudden variations in data collection.
Organize and “wrangle” large datasets in order to extract actionable insights from them.
Integrate and prepare large, varied datasets, architecting specialized database and computing environments.
Build predictive models using machine learning algorithms and programming knowledge to efficiently process large datasets, applying treatments, filters, and conditions to the data.
Apply exploratory data analysis and predictive methods to navigate a dataset and draw broad conclusions from an initial appraisal.
Work closely with clients, data stewards and other teams to turn data into critical information and knowledge that can be used to make sound organizational decisions.
Responsible for modeling complex problems, discovering insights and identifying opportunities through the use of statistical, algorithmic, data mining and data visualization techniques.
Responsible for complex data models and object-relational database mapping while producing complex reports.
Validate findings using an experimental and iterative approach.
Create meaningful data visualizations that communicate findings and relate them back to how insights create business impact using Tableau.
Responsible for the design, development and creation of Tableau reports based on business requirements.
Review and provide technical solutions to projects which may be in different stages of the development life cycle.
Responsible for the creation and maintenance of Python and Java routines and SQL queries.
Write ad-hoc queries based on schema knowledge for various reporting requirements.
Ensure that all deliverables are thoroughly documented.
Other duties may be assigned.
QUALIFICATIONS & REQUIREMENTS:
Proven experience as a Data Scientist.
Master’s Degree in Computer Science, Information Systems, Statistics, Applied Math, or an equivalent combination of education and experience; Master’s highly preferred.
Advanced, hands-on programming knowledge of Python and SQL.
Experience and expertise in Data Science disciplines such as machine learning using Python, data analysis, Big Data, and data visualization with Tableau.
Previous experience with predictive modeling, machine learning, and analyzing large amounts of data using frameworks such as Keras, TensorFlow, Apache Spark, Hadoop, etc.
Experience with the design and development of packages, and code maintenance using version control tools such as GitHub, highly preferred.
Experience with Unix/Linux including basic commands and shell scripting.
Working knowledge of SQL Loader and Import/Export utilities.
Knowledge of Java with strong OOAD fundamentals.
In-depth knowledge of Tableau reporting tool.
Self-motivation and the ability to learn quickly.
Experience with cloud and database technologies such as AWS and Google Cloud.
Working knowledge of job scheduling and monitoring tools.
Strong problem solving and analytical skills.
Experience working with multiple projects/tasks and the ability to prioritize.
Excellent written communication and presentation skills.
WORKING CONDITIONS / WORK ENVIRONMENT:
Moderate noise level associated with open office work environment.
While performing the duties of this job, the employee is regularly required to talk or hear; stand, walk, sit, use hands to finger, handle or feel objects, and reach with hands and arms. The employee occasionally will lift and/or move up to 25 pounds.