Data Architect

United States - Remote

Applications have closed

Rackspace

As a cloud computing services pioneer, we deliver proven multicloud solutions across your apps, data, and security. Maximize the benefits of modern cloud.

Our Data Architects are experienced technologists with technical depth and breadth, along with strong interpersonal skills. In this role, you will work directly with customers and our team to help enable innovation through continuous, hands-on deployment across technology stacks. You will serve as the Data SME for our customers and be the focal point of contact in engagements. If you get a thrill working with cutting-edge technology and love to help solve customers’ problems, we’d love to hear from you. It’s time to rethink the possible. Are you ready?

What You’ll Be Doing:

  • Defining, communicating and implementing data architecture strategy and infrastructure 
  • Own our enterprise data strategy, with the ability and desire to roll up your sleeves technically when necessary
  • Work with Analysts and Business Users to translate functional specifications into data processes and models 
  • Identify source systems and data sources, along with any necessary rules for transformation, consolidation, merging, and cleansing
  • Strong knowledge of replication and change data capture processes, as well as T-SQL and dynamic SQL skills
  • Strong knowledge of data architectures to support reporting and data feeds (star schemas, slowly changing dimensions, etc.) 
  • Experience with migrating from on-prem data platforms to cloud platforms (specifically Azure and GCP data products)
  • Understanding of Big Data storage and integration architectures (e.g., streaming & near-real time data pipelines) 
  • Strong expertise in at least two of the three major hyperscalers, with focused expertise in data analytics
  • Willingness to learn and champion new technologies that support business capabilities 
  • Eager to explore how machine learning and artificial intelligence can support business objectives 
  • Ability to lead and coordinate an offshore team 

The Skills You Bring:

  • Experience across all phases of data modeling, from conceptualization to database optimization 
  • Experience with Data Warehousing (data lakes, operational data stores, integration hubs, etc.) and OLTP database management systems 
  • Deep experience in evaluating data flows and data models across the company, identifying improvement opportunities, and building a plan to implement change
  • Solid understanding of cloud computing concepts, particularly Azure and GCP data architecture
  • Deep experience with Data Architecture / Data Modeling, both Dimensional and Relational 
  • History of working successfully with cross-functional engineering teams 
  • Passion for acquiring, analyzing, and transforming data to generate insights 
  • Strong organizational and interpersonal skills; Excellent verbal and written communication skills 
  • Must be able to effectively handle multiple deliverables simultaneously in a deadline-driven environment

Qualifications & Experience:

  • Working knowledge of reporting applications (Tableau, Power BI, and Microsoft SSRS preferred)
  • Working knowledge of data stream-processing systems and API Web Services concepts 
  • 15+ years of data warehouse and analytics system development and deployment experience
  • 10+ years of experience in database architectures and data pipeline development 
  • 8+ years of experience with modern data warehousing platforms using cloud-native technologies
  • 6+ years of experience in delivering Azure/GCP/AWS Data Solutions. 
  • Hands-on experience with documentation frameworks to illustrate data models, data flows, and deliverables that track the data flowing into our platforms
  • Demonstrated knowledge of software development tools and methodologies 
  • Presentation skills with a high degree of comfort speaking with executives, IT management, and developers 
  • Excellent communication skills, with the ability to pitch conversations at the right level for the audience
  • Technical degree required; Computer Science or Math background desired 
  • Demonstrated ability to adapt to new technologies and learn quickly 
  • Experience with Google Cloud services such as streaming and batch pipelines, BigQuery, Bigtable, Data Studio, Dataprep, Pub/Sub, Cloud Storage, Dataflow, Dataproc, and DFunc
  • Knowledge and proven use of contemporary data mining, cloud computing, and data management tools, including but not limited to Microsoft Azure, AWS, Google Cloud, Hadoop, HDFS, MapR, and Spark.
  • Design and configuration of data movement, streaming, and transformation (ETL) technologies such as Azure Data Factory, Informatica, NiFi, HDF, Kafka, Storm, Sqoop, SSIS, Logic Apps, Signiant, Aspera, Alteryx, Pentaho, Alooma, and Airflow.
  • Creation of descriptive, predictive and prescriptive analytics solutions using Azure Stream Analytics, Azure Analysis Services, Data Lake Analytics, HDInsight, HDP, Spark, Databricks, MapReduce, Pig, Hive, Tez, SSAS. 
  • Large-scale design, implementation, and operations of OLTP, OLAP, DW, and NoSQL data storage technologies such as SQL Server, Azure SQL, Azure SQL DW, PostgreSQL, Cosmos DB, Redis Cache, Azure Data Lake Store, Azure Synapse Analytics, Azure Databricks, Hadoop, Hive, MySQL, Neo4j, Cassandra, and HBase
  • Experience working within an agile development process (Scrum, Kanban, etc.)
  • Expertise in data estate workloads like HDInsight, Hadoop, Cloudera, Spark, Python. 
  • Familiarity with CI/CD concepts 
  • Strong verbal and written communications skills are a must, as well as the ability to work effectively across internal and external organizations and virtual teams. 
  • Knowledge or hands-on experience with data visualization and/or data sciences. 
  • Ability to succeed in a fast-paced, matrixed environment

Must-haves:

  • Hands-on experience with Azure/GCP projects.
  • Cloud certifications such as GCP Professional Data Engineer or Microsoft Data / AI certifications. 
  • Technical degree required; Computer Science or Math background desired 

Location:

  • This is a virtual role
  • The candidate must be based in the US or Canada

About Rackspace Technology

We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology

Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.

Position is available for remote work in the following states unless otherwise specified: Alabama, Arizona, Arkansas, California, Connecticut, Delaware, District of Columbia, Florida, Georgia, Idaho, Illinois, Indiana, Iowa, Kansas, Kentucky, Louisiana, Maine, Maryland, Massachusetts, Michigan, Minnesota, Mississippi, Missouri, Montana, Nebraska, Nevada, New Hampshire, New Jersey, New Mexico, New York, North Carolina, North Dakota, Ohio, Oklahoma, Oregon, Pennsylvania, Rhode Island, South Carolina, South Dakota, Tennessee, Texas, Utah, Vermont, Virginia, Washington, West Virginia, Wisconsin, Wyoming.

Tags: Agile Airflow APIs AWS Azure Big Data BigQuery Bigtable Cassandra CI/CD Computer Science Data Analytics Databricks Dataflow Data management Data Mining Data pipelines Data strategy Data visualization Data Warehousing Engineering ETL GCP Google Cloud Hadoop HBase HDFS Informatica Kafka Kanban Machine Learning Mathematics MySQL Neo4j NoSQL OLAP Pentaho Pipelines PostgreSQL Power BI Python Scrum Security Spark SQL SSIS Streaming Tableau T-SQL

Perks/benefits: Career development

Regions: Remote/Anywhere North America
Country: United States
Category: Architecture Jobs
