DISH

Big Data Architect

US-CO-Englewood
Job ID
2015-33422
Category
Information Technology

Summary

DISH is a Fortune 200 company with more than $15 billion in annual revenue that continues to redefine the communications industry. Our legacy is innovation and a willingness to challenge the status quo, including reinventing ourselves. We disrupted the pay-TV industry in the mid-90s with the launch of the DISH satellite TV service, taking on some of the largest U.S. corporations in the process, and grew to be the fourth-largest pay-TV provider. We are doing it again with the first live, internet-delivered TV service – Sling TV – that bucks traditional pay-TV norms and gives consumers a truly new way to access and watch television.

 

Now we have our sights set on upending the wireless industry and unseating the entrenched incumbent carriers.

 

We are driven by curiosity, pride, adventure, and a desire to win – it’s in our DNA. We’re looking for people with boundless energy, intelligence, and an overwhelming need to achieve to join our team as we embark on the next chapter of our story.

 

Opportunity is here. We are DISH.

Job Duties and Responsibilities

The Big Data Development team processes over a billion incoming data records each day. We use a Cloudera Hadoop cluster with 1,000 CPUs and 1.5 petabytes of data capacity to provide cutting-edge solutions to virtually every line of business at DISH. We are looking for a Java/Hadoop Architect/Developer to join our team in Englewood, CO, and help lead us to best-in-class performance.

 

Primary responsibilities fall into the following categories:

  • Ability to translate high-level business requirements into detailed design
  • Aptitude to identify, create, and use best practices and reusable elements
  • Ability to solve practical problems and deal with a variety of concrete variables in situations where only limited standardization exists
  • Strong desire to learn a variety of technologies and processes with a "can do" attitude
  • Experience guiding and mentoring 2-5 developers on various tasks
  • Ability to own a complete functional area, from analysis through design, development, and ongoing support

 


 

Skills - Experience and Requirements

A successful Hadoop Developer/Architect will have the following:

  • Four-year college degree (Bachelor of Science preferred) and 6+ years of professional development experience, or an equivalent combination of education and work experience
  • Java/J2EE object-oriented, pattern-based development experience is required
  • Experience with Cloudera Hadoop, Hive, Impala, Kafka, Informatica ETL, data-prep tools, time-series data processing, and data science techniques is a definite plus
  • Experience with XML, JSON, SQL, UNIX, Eclipse, and IP network protocols, and with delivering projects in an agile environment using Scrum/XP methodologies
  • Experience working in a fast-paced environment with a focus on test-driven development and CI/CD
  • Source code management experience (Subversion, PVCS, Maven, etc.); web services development experience preferred; PL/SQL development with Oracle / SQL Server


Connect With Us!

Not ready to apply? Connect with us for general consideration.