Job Description
This position is responsible for developing, integrating, testing, and maintaining existing and new process flows relating to Master Data Management. The role works with the Project Owner/Sponsor to confirm the project scope, goals, objectives, and business justification; secure project resources (people and budget); and reiterate the project mandate.
Required Job Qualifications
- 3-7 years of SQL experience.
- Experience with scripting languages such as Python or Scala.
- Creating, maintaining and updating key project documentation.
- Leading, monitoring and maintaining progress of the project to ensure delivery of goals within the agreed constraints of time, cost and quality.
- Continually assessing potential risks and issues, actively managing remediation and maintaining risk and issue logs, and contingency plans.
- Managing complex dependencies, rules and requirements for data sets.
- Communicating with business stakeholders and subject matter experts.
- Identifying interdependencies between the various stakeholder groups to ensure all are aligned and risks are identified, mitigated and communicated.
- Background in healthcare data; exposure to FACETS.
- Data integration experience.
- Ability to manage workload and priorities.
- Exposure to cloud environments (Azure, AWS, Google Cloud).
- Comfortable working with large data sets.
- Understanding of pipeline development, deployment, and code storage (Git).
- Exposure to SDLC Methodology (Agile / Scrum / Iterative Development).
- Testing, Audit/Balance/Control (ABC) checks, and logging.
- Create and maintain documentation, including data mappings, data dictionaries, and workflows.
- Troubleshoot issues and performance-tune jobs as needed.
- Monitor and maintain processes to ensure optimal performance and data accuracy.
- Participate in code reviews, testing, and deployment of ETL jobs.
- Coordination of platform maintenance, including change and release management (planned and emergency deployments, configuration, and patching).
- Real-time system monitoring, including engineering and implementing monitoring solutions using custom and off-the-shelf tools.
Preferred Job Qualifications
- Extensive hands-on experience designing, developing, and maintaining large-scale data solutions on Big Data platforms such as the Hadoop ecosystem.
- 3-7 years of strong SQL skills, with knowledge of Hadoop and Spark structures.
- Extensive knowledge of building and supporting ETL code.
- 3-5 years of experience with Master Data Management, Audit/Balance/Control, change capture, testing, and data profiling.
- Strong communication skills, with prior experience in requirements gathering and working with SMEs and stakeholders at various leadership levels.
- Knowledge of data modeling, data warehousing, and ETL security and permission schemes.
- Exposure to Big Data patterns and data formats.
- Ability to support project and/or product teams on functional and technical design activities, acting as a Subject Matter Expert (SME).
- Sharp problem-solving, analytical, and innovation skills.
- Strong oral/written communication skills, with the ability to translate complex ideas into simple language for non-technical audiences.
Qualifications
- Bachelor’s degree in information technology or a related discipline.