Manager, Digital Technology and Metadata at FOX Technology Center, LLC in Tempe, Arizona, United States
Job Description
Name of Employer: Fox Technology Center LLC
Job Title: Manager, Digital Technology and Metadata
Job Site: 2010 E. Centennial Circle, Tempe, AZ 85284
Job Duties:
Utilize a background in big data engineering to help define the transition from traditional broadcast to the video distribution systems that will shape the industry well into the 21st century. As part of a new team of technologists reshaping the industry and building the future for FOX, report to the Director of Technology Architecture and work in close alignment with the upstream and downstream architects, technology developers, and business analysts from other teams defining this new space. Specific duties include:
1) Maintain custody of FOX broadcast and operational data models through ongoing maintenance, insight gathering, and periodic evaluations.
2) Interact with external and internal metadata providers, upstream business stakeholders, downstream data analysts, and architecture teams.
3) Maintain data APIs throughout the ecosystem.
4) Ensure the quality of data aggregation and curation for downstream stakeholders.
5) Collaborate with technical teams in support of the data flows in the FOX playout systems.
6) Provide guidance and leadership in data governance; assist the analytics team; and provide solutions to the development team to implement data strategies, build data flows, and develop data models that can be used for analytics purposes across different business domains.
7) Architect and approve robust strategies for continuous improvement in data management, including utilizing cloud and AWS technologies to transform data into business assets that serve both the Business Intelligence team and business analysts throughout the organization, and utilizing SQL, big data, and data warehousing technologies to automate processes, including designing, developing, and maintaining data aggregation and summarization jobs in a Unix/Linux environment in accordance with accredited quality standards.
8) Participate in the design and development of ETL methodology, designing source-to-target mappings that support data transformations and processing within a corporate-wide, cloud-based ETL solution.
9) Develop and manage stable, scalable data pipelines that cleanse, structure, and integrate disparate big data assets, using different databases in combination for data extraction and loading, joining data extracted from different databases, and loading it into a target database.
10) Support continuous integration development using Jenkins, Docker, Git, etc.