- Hands-on experience leading large-scale global data warehousing and analytics projects.
- Demonstrated ability to think strategically about business, product, and technical challenges in a global environment.
- Design, review, and deliver robust frameworks for Big Data handling on a petabyte-scale data analytics platform.
- Own and develop data pipelines using cutting-edge technologies in the Big Data and Cloud ecosystems.
- Develop interactive dashboards, reports, and analysis templates.
- Provide accurate estimations for workloads and own their delivery.
- Lead and deliver optimization and performance-tuning processes for Big Data workloads.
- Lay down guidelines and SOPs for the team on cloud cost reduction and optimization.
- Review and plan deployments in an agile fashion, adapting readily to changing priorities and timelines.
- Strong communication skills, including representing the company in industry standards organizations, technical forums, or events related to Cloud Security.
- Strong technical team leadership, mentorship, and collaboration.
- 12+ years of experience in programming and application development, with at least 3 years leading Big Data engineering projects.
- Extremely proficient in Java/Python/Scala, with deep knowledge of data structures and algorithms.
- Excellent SQL skills and extensive experience with RDBMS/NoSQL database concepts.
- Good knowledge of one or more web frameworks such as ReactJS, AngularJS, or Django.
- Strong knowledge of and experience with one or more tools/frameworks from each of the following categories:
  - Query/data processing engines: Spark, Hive, Athena/Presto
  - Data warehouses: BigQuery, Redshift, Druid, Snowflake
  - AWS: S3, Glue, EMR, RDS (or)
  - GCP: GCS, Compute Engine, BigQuery, Dataflow
  - Message queuing: Apache Kafka, AWS Kinesis, Pub/Sub
  - Orchestration: Luigi, Airflow, Azkaban, etc.
- Basic knowledge of software architecture design, Docker, and microservices concepts.
- Very good leadership skills; highly capable of owning project delivery with teams of engineers spanning a mix of skill sets.
Nice-to-Have Skills
- Good knowledge of Kubernetes.
- Experience with operational aspects of platforms, such as monitoring and alert management.
Agilisium is an AWS Advanced Consulting Partner that enables companies to accelerate their "Data-to-Insights-Leap". With $25+ million in annual revenue and over 40% year-over-year growth, Agilisium is one of the fastest-growing IT solution providers in Southern California.

Our most important asset? People. In fact, talent management plays an important role in our business strategy. We're looking for "drivers": big thinkers with a growth-oriented, strategic mindset — people who are committed to customer obsession and aren't afraid to experiment with new ideas. We are all about finding and nurturing individuals who are ready to do great work. At Agilisium, you'll collaborate with great minds while being challenged to meet and exceed your potential.
Job Category: Big Data, Data Warehousing and Analytics, Java, Python