• Learners
  • Education providers
  • Enterprises
  • Government
  • Associates
Learners
  • Access top educators’ courses, right here, right now
  • Affordable education, anytime and anywhere, for students & professionals
  • One place to apply for on-campus & online programs
  • A program to fit your budget
  • Earn Certificates, Diplomas and Degrees
  • Add wings to your career, stay up-to-date in your field
Education providers
Universities, Schools, Colleges, Training Companies, Coaching and Teaching Faculty
  • Buy, sell and deliver courses online here
  • No Capex, better ROI
  • Enroll candidates for online & on campus programs
  • Total solutions for education delivery management
  • Address challenges: growing costs, faculty, scale, relevance and effectiveness
  • Run conferences
Enterprises
  • Total solution to manage Learning and Development
  • Address challenges: growing costs, trainers, scale, relevance and effectiveness
  • No capex, better ROI
  • Purchase the best courses and put them inside your LMS
  • Deliver interactive training to employees, partners and customers
  • Freedom from a dead LMS and black-box systems
Government
  • Eradicate the birth penalty
  • Better & affordable education for all
  • Equal opportunity for everyone
  • Scalable vocational training infrastructure
  • Improve employability
  • Reduce the digital divide
Associates
  • Become an associate or representative of the best educators
  • Earn without any investment
  • Get the best training
  • Buy and sell educational programs
  • Enroll candidates for online & on campus programs

Available courses

The surveillance departments of exchanges and securities boards need to monitor unusual activity in financial markets. However, this can only happen post facto, and there are far too many counters for the task to be managed manually in any meaningful way. The project aims at identifying potentially unhealthy market positions so that they may be investigated or monitored over the following 2 or 3 sessions. The idea is to identify OHLC patterns of up to 3 days that indicate over-bought or over-sold markets.

The type of OHLC day can be positive or negative, long or short, and top heavy, bottom heavy or long legged (12 types in all). These can combine into patterns such as three positive short days, or a short day followed by a top-heavy day. (There are a dozen patterns; time permitting, each team may implement 2 or 3 of them.) The events of interest are a sudden rise or fall emerging from a gradual or sharp rise or fall (8 events in total).
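
As a starting point, a single day's bar can be typed from its OHLC values alone. Below is a minimal Python sketch of such a classifier; the function name and the thresholds are illustrative assumptions, not part of the project specification.

```python
# Classify one day's OHLC bar: positive/negative, long/short body, and
# top heavy / bottom heavy / long legged by shadow size. Thresholds are
# hypothetical.
def classify_day(o, h, l, c, long_body_pct=0.02):
    body = abs(c - o)
    upper = h - max(o, c)   # upper shadow
    lower = min(o, c) - l   # lower shadow
    sign = "positive" if c >= o else "negative"
    length = "long" if body / o >= long_body_pct else "short"
    if upper > 2 * body and lower > 2 * body:
        shape = "long legged"
    elif upper > 2 * body:
        shape = "top heavy"
    elif lower > 2 * body:
        shape = "bottom heavy"
    else:
        shape = "balanced"
    return sign, length, shape

print(classify_day(o=100.0, h=104.0, l=99.5, c=100.5))  # ('positive', 'short', 'top heavy')
```

A pattern detector would then slide over consecutive classified days, matching sequences such as three positive short days.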

To enable participants to implement ML algorithms using the scikit-learn package in Python. The course aims at hands-on training in the implementation of various ML tools in Python.

(The internal logic and mathematical derivations will not be discussed or taught; only hands-on training in the implementation will be provided.)
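
For a flavour of what the hands-on sessions cover, here is a minimal scikit-learn workflow (load a built-in dataset, split, fit, evaluate); it is an illustrative sketch, not course material.

```python
# Train and evaluate a random forest on the bundled iris dataset.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Accuracy:", accuracy_score(y_test, model.predict(X_test)))
```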

Natural language processing (NLP) is a field of computer science, artificial intelligence, and linguistics concerned with the interactions between computers and human (natural) languages. As such, NLP is related to the area of human–computer interaction. Many challenges in NLP involve natural language understanding, that is, enabling computers to derive meaning from human or natural language input, and others involve natural language generation.

Significant growth in the volume and variety of data is due to the accumulation of unstructured text data—in fact, up to 80% of all your data is unstructured text data. Companies collect massive amounts of documents, emails, social media, and other text-based information to get to know their customers better, offer customized services, or comply with federal regulations. However, most of this data is unused and untouched.

Text analytics, through the use of natural language processing (NLP), holds the key to unlocking the business value within these vast data assets. In the era of big data, the right platform enables businesses to fully utilize their data lake and take advantage of the latest parallel text analytics and NLP algorithms. In such an environment, text analytics facilitates the integration of unstructured text data with structured data (e.g., customer transaction records) to derive deeper and more complete depictions of business operations and customers.

Course Content:

  • Corpus
  • Unstructured Data Cleansing
  • Unstructured Data Analysis
  • Tagging
  • Vocabulary Mapping
  • Hidden Markov Models
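
To give a feel for two of the topics listed above, loading corpus-style text and tagging it, here is a small sketch using Python's NLTK; the library choice and the sample sentence are our own assumptions, not course requirements.

```python
# Tokenize a sentence and tag each token with its part of speech (NLTK).
import nltk
nltk.download("punkt", quiet=True)                       # tokenizer model
nltk.download("averaged_perceptron_tagger", quiet=True)  # POS tagger model

text = "Companies collect massive amounts of documents and emails."
tokens = nltk.word_tokenize(text)
print(nltk.pos_tag(tokens))  # e.g. [('Companies', 'NNS'), ('collect', 'VBP'), ...]
```
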
The Hadoop administrator course provides participants with a comprehensive understanding of all the steps necessary to operate and maintain a Hadoop cluster using the Apache Hadoop distribution. From installation and configuration through load balancing and security, this course provides hands-on preparation for the real-world challenges faced by Hadoop administrators.

Python is a powerful, flexible, open-source language that is easy to learn, easy to use, and has powerful libraries for data manipulation and analysis. It is among the top three languages for analytics applications, along with R and SAS. Having originated as an open-source scripting language, Python has grown in usage over time. Today it sports libraries (NumPy, SciPy and matplotlib) and functions for almost any statistical operation or model building you may want to do. Since the introduction of pandas, it has become very strong in operations on structured data. Its simple syntax is very accessible to programming novices, and will look familiar to anyone with experience in Matlab, C/C++, Java, or Visual Basic. Python has a unique combination of being both a capable general-purpose programming language and easy to use for analytical and quantitative computing.

For over a decade, Python has been used in scientific computing and highly quantitative domains such as finance, oil and gas, physics, and signal processing. It has been used to improve Space Shuttle mission design, process images from the Hubble Space Telescope, and was instrumental in orchestrating the physics experiments which led to the discovery of the Higgs boson (the so-called "God particle"). At the same time, Python has been used to build massively scalable web applications like YouTube, and has powered much of Google's internal infrastructure. Python is easy for analysts to learn and use, but powerful enough to tackle even the most difficult problems in virtually any domain. It integrates well with existing IT infrastructure and is highly platform-independent. Among modern languages, its agility and the productivity of Python-based solutions are legendary. Companies of all sizes and in all areas, from the biggest investment banks to the smallest social/mobile web app startups, are using Python to run their business and manage their data.

This course covers the Python programming language and standard library, with a focus on applying Python to problems in scripting, data analysis, and systems programming. It assumes no prior experience with Python, but does assume participants know how to program in another programming language. It is especially well suited for programmers who want to know what Python is all about without extra fluff.

Course Content:

  • Strings and Lists
  • Loops
  • Functions
  • Tuples, Dictionaries and Files
  • Regular Expressions
  • NLTK
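
As a small taste of several listed topics at once (strings, dictionaries and regular expressions), here is a self-contained example; the log line and names are invented for illustration.

```python
# Extract key=value pairs from a string with a regular expression and
# collect them into a dictionary.
import re

log = "user=alice action=login; user=bob action=logout"
pairs = re.findall(r"user=(\w+) action=(\w+)", log)  # [('alice', 'login'), ('bob', 'logout')]
actions = {user: action for user, action in pairs}   # dict comprehension
print(actions)                                       # {'alice': 'login', 'bob': 'logout'}
```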

In this course the student will gain a broad understanding of modern computer programming. The student will acquire introductory skills in problem analysis, solution design, and program construction. Through practical programming activities, the student will gain an appreciation of the nature and history of computer programming.

Fundamentals of Data Science

There is much debate among scholars and practitioners about what data science is, and what it isn’t. Does it deal only with big data? What constitutes big data? Is data science really that new? How is it different from statistics and analytics?

One way to consider data science is as an evolutionary step in interdisciplinary fields like business analysis that incorporate computer science, modeling, statistics, analytics, and mathematics. Data science is the study of the generalizable extraction of knowledge from data, yet the key word is science. It incorporates varying elements and builds on techniques and theories from many fields, including signal processing, mathematics, probability models, machine learning, statistical learning, computer programming, data engineering, pattern recognition and learning, visualization, uncertainty modeling, data warehousing, and high performance computing with the goal of extracting meaning from data and creating data products.

From government, social networks and ecommerce sites to sensors, smart meters and mobile networks, data is being collected at an unprecedented speed and scale. Data science can put big data to use.

[Factoid graphics: the average number of “likes” and comments posted on Facebook daily; the percentage of the world’s data produced in the last two years; the projected volume of e-commerce transactions in 2016.]

Data science is not restricted to big data alone, although the fact that data is scaling up makes big data an important aspect of data science.

Course Coverage:

  • Introduction to Data Science
    • Data in the data science ecosystem
  • Data sets: training and testing data sets
    • Volume, Variety, Velocity and Value
    • Structured data, unstructured data, text data
    • Metadata modelling / the KDM standard
    • MOF, KDD and KDDML
    • The Data Mining Group and the PMML standard
    • Unified Modeling Language for metadata modelling
    • Text data and classification
    • Global standards: UNSPSC, DBpedia
    • Data science solutions using platforms: IaaS-, PaaS- and SaaS-based solution approaches
  • Big Data and Big Data technology, tools and platforms
    • Why Big Data?
    • The Hadoop framework
    • MapReduce
    • YARN
    • HDFS
  • Services attached to the Hadoop framework (HBase, Hive, ZooKeeper, Cassandra, MongoDB)
    • APIs and their integration model
  • Data Science Platform
    • Virtual infrastructure platforms and the public cloud
    • The AWS Elastic MapReduce platform
    • Apache Spark, Spark SQL, Apache Storm
    • Machine learning platforms
    • API design and models for platforms
    • Data platform: MapR
  • Data Science Services Platform
    • Data set design and modelling using the UML Infrastructure and MOF
    • KDM and KDDML for numeric data
    • Content/text data modelling
    • Text/content extraction
    • XML pipelines for text aggregation and transformation
    • Semantic content and annotation
    • The OWL, RDF and RDF graph standards
    • Ontology, vocabulary, Linked Data Platform
    • UIMA and NLP for text analysis
  • Algorithms and Machine Learning
    • Classification of data: support vector machines
    • Clustering of data: k-means (see the sketch after this list)
    • Collaborative filtering & recommendation engines (e.g., the easyrec engine)
    • Business intelligence platforms
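
As flagged in the list above, here is a minimal k-means clustering sketch using scikit-learn; the synthetic two-cluster data and every parameter value are illustrative assumptions.

```python
# Cluster synthetic 2-D points into two groups with k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
points = np.vstack([rng.normal(0, 0.5, (50, 2)),   # cluster near (0, 0)
                    rng.normal(3, 0.5, (50, 2))])  # cluster near (3, 3)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(km.cluster_centers_)  # two centers, approximately (0, 0) and (3, 3)
```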

What is Hadoop? 

Apache™ Hadoop® is an open source software project that enables distributed processing of large data sets across clusters of commodity servers. It is designed to scale up from a single server to thousands of machines, with a very high degree of fault tolerance. Rather than relying on high-end hardware, the resiliency of these clusters comes from the software's ability to detect and handle failures at the application layer.

It is a data storage and processing system that enables data storage, file sharing, data analytics and more. The technology is scalable and enables effective analysis of large volumes of unstructured data, thereby adding value. With the growing role of social media and internet communication, Hadoop is widely used across a broad spectrum of companies, from Facebook to Yahoo. Other big users of Hadoop include Cloudera, Hortonworks, IBM, Amazon, Intel, MapR and Microsoft. The technology lets its users handle more data through enhanced storage capacity, and also enables data recovery in case of hardware failure.

Hadoop as a solution increasingly offers data retrieval and data security features, and these features keep improving over time, making it an ever stronger complement to database management systems (DBMS). Hadoop software is the fastest-growing segment of the market, ahead of hardware and services.

Who is Using Hadoop? 

With the growing role of social media and internet communication, Hadoop is used by a broad spectrum of companies to handle more data through enhanced storage capacity and to recover data in case of hardware failure. Major users include:

  • Yahoo (one of the biggest users, and contributor of more than 80% of Hadoop's code)
  • Facebook
  • Cloudera
  • Hortonworks
  • IBM
  • Intel
  • MapR
  • Microsoft
  • Netflix 
  • Amazon
  • Adobe 
  • eBay
  • Hulu
  • Twitter
  • Snapdeal
  • TataSky

Why use Hadoop?

Hadoop changes the economics and the dynamics of large-scale computing. Its impact can be boiled down to four salient characteristics. Hadoop enables a computing solution that is:

Scalable

A cluster can be expanded by adding new servers or resources without having to move, reformat, or change the dependent analytic workflows or applications.

Cost effective

Hadoop brings massively parallel computing to commodity servers. The result is a sizeable decrease in the cost per terabyte of storage, which in turn makes it affordable to model all your data.

Flexible

Hadoop is schema-less and can absorb any type of data, structured or not, from any number of sources. Data from multiple sources can be joined and aggregated in arbitrary ways, enabling deeper analyses than any one system can provide.

Fault tolerant

When you lose a node, the system redirects work to another location of the data and continues processing without missing a beat.

Hadoop architecture

Hadoop is composed of four core components—Hadoop Common, Hadoop Distributed File System (HDFS), MapReduce and YARN.
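
To make the MapReduce component concrete, here is the classic word-count example written as a pair of Hadoop Streaming scripts in Python. This is an illustrative sketch rather than course material; the file names mapper.py and reducer.py are our own choices.

```python
# mapper.py -- read lines from stdin, emit one (word, 1) pair per word.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
# reducer.py -- sum the counts for each word; Hadoop Streaming delivers
# the mapper output grouped and sorted by key.
import sys

current_word, running_total = None, 0
for line in sys.stdin:
    word, count = line.rsplit("\t", 1)
    if word != current_word:
        if current_word is not None:
            print(f"{current_word}\t{running_total}")
        current_word, running_total = word, 0
    running_total += int(count)
if current_word is not None:
    print(f"{current_word}\t{running_total}")
```

On a cluster these scripts would typically be passed to the hadoop-streaming jar via its -mapper and -reducer options; locally they can be tested with ordinary shell pipes (cat file | python mapper.py | sort | python reducer.py).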

Hadoop for the Enterprise from IBM

IBM BigInsights brings the power of Hadoop to the enterprise, enhancing Hadoop by adding administrative, discovery, development, provisioning, security and support capabilities, along with best-in-class analytics. IBM® BigInsights™ for Apache™ Hadoop® is a complete solution for large-scale analytics. Explore Big Data analytics using IBM InfoSphere BigInsights, taught by IBM experts at Aegis.

Sample Jobs in Hadoop

  • Hadoop Developer
  • Hadoop Architect
  • Hadoop Tester
  • Hadoop Administrator
  • Data Scientist

Companies Recruiting

  • IBM
  • Myntra
  • Snapdeal
  • HP
  • EMC
  • Cloudera

Hadoop Course Overview

Through lectures, hands-on exercises, case studies, and projects, the students will explore the Hadoop ecosystem, learning topics such as:

  • What Hadoop is and the real-world problems it solves
  • MapReduce concepts and how MapReduce works
  • Writing MapReduce programs
  • Architecting Hadoop-based applications
  • Understanding Hadoop operations

Prerequisites and Requirements

  • Programming proficiency in Java or Python is required. Prior knowledge of Apache Hadoop is not required.
  • The primary language during lectures will be Java; however, assignments and projects completed in either Java or Python will be accepted.

Note: For students who do not have a programming background in Java or Python, additional readings or learning videos will be prescribed. The programming prerequisites will need to be completed within the first 2 weeks of the course.

Course Contents

  1. Introduction to Hadoop: Real-World Hadoop Applications and Use Cases, Hadoop Ecosystem & projects, Types of Hadoop Processing, Hadoop Distributions, Hadoop Installation.
  2. Hadoop MapReduce: Developing a MapReduce Application, MapReduce Internals, MapReduce I/O, Examples illustrating MapReduce Features.
  3. Hadoop Application Architectures: Data Modeling, Data Ingestion, and Data Processing in Hadoop.
  4. Hadoop Projects: Practical Tips for Hadoop Projects.
  5. Hadoop Operations: Planning a Hadoop Cluster, Installation and Configuration, Security, and Monitoring.
  6. Hadoop Case Studies: A series of Group Discussions and Student Project Presentations.

References

  1. Lecture notes will form the primary reference material for the course.
  2. Specific reading material, papers, and videos will be assigned during the course to augment the lecture notes.
  3. Reference Textbook: Hadoop: The Definitive Guide, 4th Edition, Tom White, O’Reilly Media, Inc.

Evaluation

  1. Homework Assignments: Students will be assigned 5 homework assignments during the course: 10%
  2. Quizzes: The best 2 out of 3 unannounced quizzes will be counted towards the final grade: 10%
  3. Mid-Term Exam: 20%
  4. Final Exam: 40%
  5. Student Project: The students will individually, or optionally in teams of two, research, design, implement, and present a substantial term project: 20%

  Consult our Big Data Career Advisers at +91 704 531 4371 / +91 981 900 8153 on how to add wings to your career

RHadoop

RHadoop is a collection of R packages that allow users to manage and analyze data with Hadoop. The packages have been tested (and always before a release) on recent releases of the Cloudera and Hortonworks Hadoop distributions, and should have broad compatibility with open source Hadoop and MapR's distribution.

Three packages of note — rmr2, rhdfs, and rhbase — provide most of RHadoop’s functionality:

  • rmr2: The rmr2 package supports translation of the R language into Hadoop-compliant MapReduce jobs (producing efficient, low-level MapReduce code from higher-level R code).

  • rhdfs: The rhdfs package provides an R language API for file management over HDFS stores. Using rhdfs, users can read from HDFS stores to an R data frame (matrix), and similarly write data from these R matrices back into HDFS storage.

  • rhbase: The rhbase package likewise provides an R language API, but its focus is database management for HBase stores rather than HDFS files.

 

Natural language processing (NLP) is a field of computer science, artificial intelligence, and linguistics concerned with the interactions between computers and human (natural) languages. As such, NLP is related to the area of human–computer interaction. Many challenges in NLP involve natural language understanding, that is, enabling computers to derive meaning from human or natural language input, and others involve natural language generation.

Significant growth in the volume and variety of data is due to the accumulation of unstructured text data—in fact, up to 80% of all your data is unstructured text data. Companies collect massive amounts of documents, emails, social media, and other text-based information to get to know their customers better, offer customized services, or comply with federal regulations. However, most of this data is unused and untouched.

Text analytics, through the use of natural language processing (NLP), holds the key to unlocking the business value within these vast data assets. In the era of big data, the right platform enables businesses to fully utilize their data lake and take advantage of the latest parallel text analytics and NLP algorithms. In such an environment, text analytics facilitates the integration of unstructured text data with structured data (e.g., customer transaction records) to derive deeper and more complete depictions of business operations and customers.

Course Content:

  • Week 1: Words
  • Week 2: Sentences
  • Week 3: Modeling of data
  • Week 4: Word sense disambiguation & usage in IRE
  • Week 5: Co-reference resolution, summarization
  • Week 6: Question answering systems & sentiment analysis
  • Week 7: Machine translation and usage in speech recognition

Grading: Projects & Assignments

  • Assignments: 30% (5% for each assignment)
    • Usage of tools:
      • Stemmer
      • Morphological Analyzer
      • WordNet
      • CoreNLP
      • Spell Correction
      • Regular Expressions
  • Projects: 70% (will be discussed in class)
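
Three of the tools named above (a stemmer, WordNet and regular expressions) are available in NLTK and Python's standard library; the following sketch shows one plausible way to invoke them and is illustrative only.

```python
# Illustrative use of three assignment tools: a stemmer, WordNet lookups
# and regular expressions (NLTK plus the standard library).
import re
import nltk
from nltk.stem import PorterStemmer
from nltk.corpus import wordnet

nltk.download("wordnet", quiet=True)  # one-time corpus download

print(PorterStemmer().stem("running"))                  # -> 'run'
print([s.name() for s in wordnet.synsets("bank")][:3])  # first few senses
print(re.sub(r"\s+", " ", "too   many   spaces"))       # simple cleanup
```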

Dynamic changes and uncertainty in the global economic environment create various challenges for business that force firms to reinvent themselves and manage for sustained profitability. Interest rates, exchange rates, government policy and the economic growth rate are the key factors that affect the decision to invest in a country, industry, sector or stock.
Hedge funds, pension funds, trust funds, investment management and asset management companies allocate portfolios of funds to countries and sectors based on their research and their expertise in the prospects of stocks and sectors to deliver expected results.
These fund managers use benchmarks to provide attractive returns to investors in the funds. Retail and institutional investors invest in various funds based on factors like risk profile, expected returns and income.
Stock market indices like the Sensex, Nifty, Dow Jones and Nasdaq are used as benchmarks to compare the performance of equity and mutual funds. There are issues and challenges in picking the right benchmark for each fund.

Since portfolio churning happens both in indices and in a fund, it becomes critical to choose the right benchmarks.
Also, with the advance of automated trading, algorithmic trading and high-frequency trading, it becomes very complex to identify risks, churn portfolios and deliver expected results. It is in this context that we are creating simple benchmarks that all stakeholders can use effectively, through a simple automated method and also a mobile application.

Visualisation

Visualisation, commonly known as data visualisation, is viewed by many disciplines as a modern equivalent of visual communication. It is not owned by any one field, but rather finds interpretation across many (e.g. it is viewed as a modern branch of descriptive statistics by some, but also as a grounded theory development tool by others). It involves the creation and study of the visual representation of data, meaning "information that has been abstracted in some schematic form, including attributes or variables for the units of information".

A primary goal of data visualization is to communicate information clearly and efficiently to users via the information graphics selected, such as tables and charts. Effective visualization helps users in analyzing and reasoning about data and evidence. It makes complex data more accessible, understandable and usable. Users may have particular analytical tasks, such as making comparisons or understanding causality, and the design principle of the graphic (i.e., showing comparisons or showing causality) follows the task. Tables are generally used where users will look-up a specific measure of a variable, while charts of various types are used to show patterns or relationships in the data for one or more variables.

Data visualization is both an art and a science. The rate at which data is generated has increased, driven by an increasingly information-based economy. Data created by internet activity and an expanding number of sensors in the environment, such as satellites and traffic cameras, are referred to as "Big Data". Processing, analyzing and communicating this data present a variety of ethical and analytical challenges for data visualization. The field of data science and practitioners called data scientists have emerged to help address this challenge.

Course Content:

  • Data Exploration in R
  • Line Chart
  • Scatter Plot
  • Histogram
  • Boxplot
  • ggplot
  • Tree Maps
  • D3.js
  • InfoVis
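
The course itself explores data in R and ggplot; purely as a language-agnostic illustration of two chart types from this list, here is a histogram and a boxplot sketched with Python's matplotlib (synthetic data, invented parameters).

```python
# Draw a histogram and a boxplot of the same synthetic sample.
import numpy as np
import matplotlib.pyplot as plt

data = np.random.default_rng(1).normal(loc=50, scale=10, size=500)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.hist(data, bins=30)
ax1.set_title("Histogram")
ax2.boxplot(data)
ax2.set_title("Boxplot")
plt.tight_layout()
plt.show()
```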

The students will acquire the skill set to use advanced statistical tools to solve complicated business problems or to create business insights that aid effective business decisions.
Pedagogy: A mixed methodology that includes lectures, case discussions, practical problem solving, and virtual-lab-based practice sessions involving the students.

The capstone involves identifying a data problem in a real-world setting and developing the means to address it. Capstone projects can be research-oriented or design-oriented. Solutions are typically interactive, meaning the end product is something that can be implemented and used.

Goals for the Capstone experience include:

  • Choose an industry or business functional area
  • Define the problem or opportunity
  • Submit a synopsis
  • Discuss with the project mentor/faculty
  • Determine what techniques to use to master this problem or opportunity
  • Synthesize all aspects of the problem: people, technology and information

The Post Graduate Diploma in Computer Education is a PG diploma that equips graduates to take on any IT-related task and to keep pace with a growing industry.

Information retrieval

It is the activity of obtaining information resources relevant to an information need from a collection of information resources. Searches can be based on metadata or on full-text (or other content-based) indexing.
Automated information retrieval systems are used to reduce what has been called "information overload". Many universities and public libraries use IR systems to provide access to books, journals and other documents. Web search engines are the most visible IR applications.

Overview:

An information retrieval process begins when a user enters a query into the system. Queries are formal statements of information needs, for example search strings in web search engines. In information retrieval a query does not uniquely identify a single object in the collection. Instead, several objects may match the query, perhaps with different degrees of relevancy. An object is an entity that is represented by information in a database. User queries are matched against the database information. Depending on the application the data objects may be, for example, text documents, images, audio, mind maps or videos. Often the documents themselves are not kept or stored directly in the IR system, but are instead represented in the system by document surrogates or metadata. Most IR systems compute a numeric score on how well each object in the database matches the query, and rank the objects according to this value. The top ranking objects are then shown to the user. The process may then be iterated if the user wishes to refine the query.

Course Curriculum

Week 1-2 (intro):

  • Introduction to Information Retrieval
  • Example use cases
  • Introduction to large-scale web search and its problems
  • Unstructured data
  • Boolean retrieval example

Week 3-4 (models and data processing):

  • Unstructured data processing
  • Term weighting
  • Vector space modeling (cosine distance); see the sketch below
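
As flagged in the list, here is a toy vector-space retrieval sketch: TF-IDF term weighting plus cosine similarity, via scikit-learn. The three documents and the query are invented for illustration.

```python
# Rank a tiny document collection against a query with TF-IDF + cosine.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "hadoop distributes processing across clusters",
    "information retrieval ranks documents by relevance",
    "search engines are the most visible IR applications",
]
vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(docs)

query_vector = vectorizer.transform(["which documents are relevant to a search"])
scores = cosine_similarity(query_vector, doc_vectors)[0]
best = max(range(len(docs)), key=lambda i: scores[i])
print(docs[best], scores[best])  # highest-scoring document and its score
```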

Week 4-5 (Elasticsearch + project assignment):

  • Installing Elasticsearch, with a walkthrough (see the client sketch below)
  • Hands-on tutorial
  • Project assignment
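
For the hands-on part, a minimal session with the official Elasticsearch Python client might look like the following; the index name and document are invented, and a node is assumed to be running locally.

```python
# Index one document, refresh the index, then run a match query
# (elasticsearch-py, recent versions).
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # assumed local node

es.index(index="courses", id=1, document={"title": "Information Retrieval"})
es.indices.refresh(index="courses")  # make the document searchable now

resp = es.search(index="courses", query={"match": {"title": "retrieval"}})
print(resp["hits"]["total"])
```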

This course is a continuation of the Introduction to Machine Learning and Data Mining course, containing in-depth coverage of advanced methods in machine learning as well as their underlying theory.
At the end of each session there will be an R practical illustrating the development of machine learning models using these algorithms.
In particular, the course will cover the following topics:
Support Vector Machines
Naive Bayes Classifier
Sequence Prediction
Neural Networks...
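
The course practicals are in R; purely as a language-neutral illustration of the first listed topic, here is a support vector machine fitted with scikit-learn on a synthetic dataset (all parameters are illustrative).

```python
# Fit an RBF-kernel SVM on the two-moons toy dataset.
from sklearn.datasets import make_moons
from sklearn.svm import SVC

X, y = make_moons(noise=0.2, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X, y)
print("Training accuracy:", clf.score(X, y))
```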

Project Title:

"Let's Find" – an application to find missing people

Requirements:

  1. Knowledge of OpenCV with Python
  2. Knowledge of face recognition algorithms (principal component analysis, linear discriminant analysis, etc.)
  3. Knowledge of artificial neural networks and other supervised classifiers
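
A plausible first step for this project is face detection with OpenCV's bundled Haar cascades (detection only; recognition with PCA/LDA would build on it). The input file name is hypothetical.

```python
# Detect frontal faces in an image using a Haar cascade shipped with OpenCV.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("person.jpg")  # hypothetical input image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
print(f"Found {len(faces)} face(s)")
```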

" An expert chat"

Requirements:

  1. Strong knowledge of NLP, information retrieval, and question answering systems
  2. Knowledge of the NLTK toolkit together with scikit-learn
  3. Knowledge of a web scripting language (PHP preferred)


Fundamentals of Data Science

There is much debate among scholars and practitioners about what data science is, and what it isn’t. Does it deal only with big data? What constitutes big data? Is data science really that new? How is it different from statistics and analytics?

One way to consider data science is as an evolutionary step in interdisciplinary fields like business analysis that incorporate computer science, modeling, statistics, analytics, and mathematics. Data science is the study of the generalizable extraction of knowledge from data, yet the key word is science. It incorporates varying elements and builds on techniques and theories from many fields, including signal processing, mathematics, probability models, machine learning, statistical learning, computer programming, data engineering, pattern recognition and learning, visualization, uncertainty modeling, data warehousing, and high performance computing with the goal of extracting meaning from data and creating data products.

From government, social networks and ecommerce sites to sensors, smart meters and mobile networks, data is being collected at an unprecedented speed and scale. Data science can put big data to use.

[Factoid graphics: the average number of “likes” and comments posted on Facebook daily; the percentage of the world’s data produced in the last two years; the projected volume of e-commerce transactions in 2016.]

Data science is not restricted to big data alone, although the fact that data is scaling up makes big data an important aspect of data science.

Online Weekend Executive Post Graduate Program (EPGP) in Data Science, Business Analytics & Big Data in association with IBM
A True Holistic Data Science Program


"There were 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created in every 2 days" 
- Eric Schmidt, Executive Chairman, Google

Every day, we create 2.5 quintillion bytes of data, so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few. This is big data, and it mainly consists of unstructured data. Big data poses challenges in terms of storage and of further analysis, both at rest and in real time; there is a gold mine to dig for those who can make sense of it.

In the global, highly interconnected world we live in, the greatest power is the right information. Data Science, Big Data and Business Analytics are thus primary tools for any organisation, society or government seeking competitive advantage and optimisation of existing ecosystems. A career in Business Analytics, Big Data and Data Science promises challenge, opportunity and reward.

Modern Data Scientists are Supermen with a Crystal Ball

Data scientists are the supermen who tell stories out of data, irrespective of its size, big or small. They answer questions like: What is happening? What will happen? What should we do now? These are commonly termed BI, predictive analytics and prescriptive analytics. They use statistics, machine learning, NLP, R, Python, Hadoop and Spark to build a crystal ball that predicts the future, solves business problems or finds the missing opportunities. Snapdeal, which has over 20 million subscribers and generates terabytes of data through its interactions with customers, in addition to a catalogue of over 5 million items, churns 15 million data points (related data sets, such as a consumer shopping on specific days for a particular thing) within two hours using Hadoop. About 35% of its orders come from recommendation and personalisation systems, and the conversion rate of such orders is 20-30% higher than normal orders, the company claims.

Post Graduate Program in Data Science, Business Analytics and Big Data  (PGP-DS-BA-BigData)

Post Graduate Program in Data Science, Business Analytics and Big Data (PGP-DS-BA-BigData) is India's first high-end data science program, designed and delivered by the Aegis School of Business, Data Science & Telecommunication in association with IBM to train a new generation of data-savvy professionals. This 11-month program provides intensive hands-on training to develop the necessary and unique set of skills required for a successful career in the fastest-growing and intellectually stimulating fields of Data Science, Big Data, Business Analytics, Predictive Analytics, NLP, ML and Cognitive Computing. Students who earn the PGP develop deep quantitative capabilities and technical expertise, and are equipped to create business and social value by extracting useful insights and applying them across various industries in the roles of Data Scientist, Big Data Adviser and modern Business Analyst.

This program is delivered at the Centre for Excellence in Telecom Technology and Management (CETTM), MTNL's world-class campus at Powai in Mumbai, with its state-of-the-art "IBM Business Analytics Lab" and "IBM Cloud Computing Lab".

 

Sheetal Soni, Country Channel Manager, IBM India speaking on PGP in Business Analytics and Big Data in association with Aegis

Aegis School of Business, Data Science & Telecommunication 
Aegis is a leading school in the field of telecom management and technology management, started in 2002 with the support of Bharti Airtel to develop future telecom leaders. Aegis offers various programs in 25 countries to top executives from top IT/telecom firms. In 2015 Aegis joined hands with IBM to offer high-end courses in Business Analytics, Big Data, Cloud Computing and Mobility. Learn more about Aegis at www.aegis.edu.in and www.bellaward.com

IBM

To meet growing client demands in today's data-intensive industries, IBM has established the world's deepest and broadest portfolio of Big Data and Analytics technologies and solutions, spanning services, software, research and hardware. Fueling this is IBM's multi-billion dollar investment in big data and analytics ($24 billion since 2005, through both acquisitions and R&D) to extend its leadership position in this strategic market. IBM has been named a Leader in the IDC MarketScape for its business analytics practice.

Why the PGP/MS in Data Science, Business Analytics and Big Data program?

Aegis is a leader in Data Science, Business Analytics and Big Data. Read what makes the Aegis PGP in Business Analytics & Big Data, India's first data science program in association with IBM, the best program in India.

Highlights of the Program

PGP program Certification from IBM

Certification from IBM, a world leader in Big Data analytics, at the completion of the program.

Best Data Science curriculum

The program curriculum is designed as per the Data Scientist's skills and competencies framework, with the help of leading data scientists. The program has a wide range of core and elective courses which no other institute around the world offers. Check the list of courses at the bottom.

Ranked Among the Top Ten Business Analytics Courses in India

Ranked among the top ten Business Analytics programs in India by Analytics Magazine. Check the ranking.

A True Data Science Program, NOT an MBA/PGDBM in Business Analytics

This program is a core-technology MS in Data Science program, NOT an inferior cousin of MBA/PGDBM programs in Business Analytics. It lays the foundation for a serious developer or data scientist career, or for Big Data advisory & consulting roles.

IBM Business Analytics Lab

IBM has set up an IBM Business Analytics Lab and an IBM Cloud Computing Lab on the campus. The program is delivered by IBM subject matter experts and leading data scientists from around the world.

Big Data Product Factory

Live projects: work on live projects such as churn prediction and call drop analysis for a leading telco, NLP projects, consumer analytics, sentiment analysis and more at the Aegis Big Data Product Factory as part of the program.

Exposure to a wide range of tech, tools and software

Hands-on exposure to IBM DB2, IBM Cognos TM1, IBM Cognos Insight, IBM InfoSphere BigInsights, IBM Worklight, IBM BlueMix, R, Python, SAS, Hadoop, MapReduce, Spark, Elasticsearch, Tableau, EMR, AWS, etc.

World class infrastructure

On-campus classes are held in world-class infrastructure at Aegis, CETTM, MTNL at Powai, Mumbai.

CMC, Internship and Placement

The Career Management Center (CMC) at Aegis facilitates excellent paid internships of 2 to 3 months for all students with various companies, which generally lead to final placement in roles such as Data Scientist, Manager Data Science, Business Analyst and Risk Analyst with companies like IBM, Persistent, HDFC, Loginext, Angelbrooking, Creditvidya, L&T Infotech, Virtusa, Ixsight, Pentation, Clover Infotech, L&T Finance, Guppy Media Incorporation, etc. This year's PGP in Data Science, Business Analytics and Big Data placement record:

Fresher lowest package: 8.5 lakh
Fresher highest package: 11 lakh
Experienced candidates got over 100% hike on their last package. *For experience up to 7 to 8 years.
90% got a Data Scientist role
Paid internship for everyone for 2 to 3 months
*Aegis does not provide any kind of job guarantee

Course Modules taught by IBM

IBM faculty will be teaching three courses: (i) Business Intelligence using Cognos BI, (ii) Big Data Analytics using IBM InfoSphere BigInsights, and (iii) Enterprise Performance Management using IBM Cognos TM1

Scholarships

Dean's Scholarship for Women in Technology Leadership; Aegis Graham Bell Award Scholarship for Big Data Products; Data Science Scholar. Study loans by HDFC Credila.

Globally acceptable Credit Structure

This program follows a globally acceptable 45-credit-unit structure for a master's degree; in North America a master's degree is a minimum of 36 credit units. The program is spread over 9 months, plus a 2-3 month internship or consulting assignment.

Skill for various industries & Functional areas

The curriculum caters to the skill requirements in various industries like Ecommerce, Telecom, Banks, Computer Services, Education, Healthcare, Insurance, Manufacturing, Retail and other industries. 

Diverse student profile

Aegis participants are from around the world, with experience ranging from 2 to 25 years across diverse industries and backgrounds.

Center of Innovation

Through Aegis's initiative, the Aegis Graham Bell Awards for innovation in TIME (Telecom, Internet, Media, Edutainment) and SMAC (Social, Mobile, Analytics and Cloud), Aegis students get to know the best innovations and their innovators. Hon'ble Dr. Jitender Singh, Minister of State in the Prime Minister's Office, Department of Atomic Energy and Department of Space, honoured the winners of the Aegis Graham Bell Awards 2015 for their breakthrough innovations. Check the winners under the Analytics and Big Data categories.

Delivered by the Best Data Scientists

This program is delivered by the best data scientists from around the world, engaged in real-life data science and Big Data analytics projects. Check profiles...

World Class LMS on Cloud

Aegis uses a world-class cloud Learning Management System (LMS) called mUniversity (mUni). All lectures, study material, ebooks, business cases and video lectures are available on the LMS. Check a sample session by Mr. Srikanth Velamakanni, CEO, Fractal Analytics, on mUni...

Big Data Day Leadership Series

The Aegis Big Data Day Meetup provides an opportunity to meet Data Science, Big Data and Analytics leaders. Register for the latest Meetup

      

 Download Aegis Business Analytics and Big Data Brochure

Is Business Analytics, Big Data and Data Science for me?

If you enjoy working with data in different ways, exploring the stories that data may reveal, and even experimenting with visualization techniques, you may enjoy a role in data science. Whether your academic background is in business, economics, technology, statistics, mathematics, or engineering, you may be looking for a career change that will prepare you to generate high impact. You search for a challenging role that allows you to integrate the value of data into all business functions. Data analysis is necessary in almost every field and industry, and Business Analysts, Big Data professionals and Data Scientists are expected to play a role in guiding strategic decisions. They are in a unique position to assist leaders in determining the right questions to ask, as well as in interpreting the answers that data provides, to help determine the organization's most effective courses of action.

Job Opportunities 

While there is a large and growing demand for professionals trained to collect data and use analytics to drive performance, there is currently a deficit in the supply of these individuals in the global job market. The deficit in the market for data/ Big Data analytics professionals is growing exponentially and poses a serious challenge for recruiters, but it is an enviable opportunity for professionals who have the right skills and training to fill these positions.

Program Models:

Full Time Post Graduation Program (OnCampus)

  • Total Program: 36 Credit Units

  • Program Duration: 11 months

  • 9 months of classroom training + live projects + one capstone project or data product development + 2 to 3 months of internships

  • 3 terms each of 3 months

    View details and Apply


    Part Time Executive Program in Weekend model (OnCampus)

  • Total Program: 36 Credit Units

  • Program Duration: 11 Months

  • Sat and Sun on-campus classes in Powai spread over 9 months

  • 9 months of classroom training + live projects + one capstone project or data product development + 2-3 months industry internship/project/consulting assignment

    View details and Apply


    Part Time Executive Program in Hybrid model (OnCampus + Online)

  • Total Program: 36 Credit Units

  • Program Duration: 11 Months

  • One Block of 5 days in every 3 months plus Sat and Sun online classes spread over 11 months 

  • 9 months of classroom training + live projects + one capstone project or data product development + 2-3 months industry internship/project/consulting assignment

    View details and Apply 


    Part Time Executive Program in Online model (Online Live Interactive)

  • Total Program: 36 Credit Units

  • Program Duration: 11 Months

  • Live interactive Classes on Saturday and Sunday

  • 9 months of classroom training + live projects + one capstone project or data product development + 2-3 months industry internship/project/consulting assignment

    View details and Apply


     Download Aegis Business Analytics and Big Data Brochure

    Eligibility Criteria for Admissions: 

    - Working executives with a bachelor's degree.

    Admission process:

    • Online application (click on Apply button on top of this page to register and apply)
    • Personal Interview
    • Admission offer (if selected)

    Get in Touch with Aegis Big Data Career Adviser Team at  +91 704 531 4371/ +91 882 808 4908/ +91 902 213 7010
    You can also email your Resume to admission@aegis.edu.in and request a call back for consultation 

    Scholarships: 

    Dean’s Scholarship for Women in Technology Leadership

    Saluting the women leaders in Technology, Aegis has announced the “Dean’s Scholarship for Women in Technology Leadership”  

    Who should apply: 

    Aspiring Women Leaders. Please Check Eligibility Criteria above.  

    What you get:

    • 100% scholarship to study the Full Time PGP in Business Analytics and Big Data Program
    • Learn and work at the Big Data Product Factory
    • Come out as a Data Scientist or Entrepreneur

    Data Science Scholar

    Who should apply:

    • PhD, MTech, MSc or MS in Computer Science, Physics, Maths, Statistics or Economics
    • or you have developed a data product

    What you get:

    • 100% scholarship to study the Full Time PGP in Business Analytics and Big Data Program
    • Learn and work at the Big Data Product Factory
    • A job offer to work as a Data Scientist at the Big Data Product Factory and Aegis (optional)

    Aegis Graham Bell Award Scholarship for Big Data Products for startups

    What you get: 

    Scholarship 
    For Executive PGP in
    Business Analytics & Big Data in association with IBM

    Workspace
    At Aegis Big Data Product Factory
    in Mumbai.

    Manpower
    Access to Highly Skilled Big Data Manpower to work on your Product/Solution.

    Mentorship
    From Business Leaders, Data Scientists & Functional Heads

    Recognition
    Get Recognized at The Largest Innovation Award in TIME & SMAC
    'Aegis Graham Bell Awards'.

    Networking
    Opportunity to network with business leaders, functional heads, peers & VCs.

    POC
    Help with proof of concept (POC)

    Funding
    Help in fundraising and introductions to VCs

    Who should apply:

    • Startup firms
    • Individuals who have developed a data product
    • Anyone with a breakthrough idea for a Big Data product/solution/platform who wants to develop it
    • See also www.bellaward.com

    How to apply for the scholarship: Click the Apply button at the top and submit the application form. The admissions and scholarship application is the same.

    Aegis School Of Business, Data Science & Telecommunication
    CETTM, MTNL, Technology Street, | Hiranandani Gardens, Powai, Mumbai – 400076, India
    Mahesh Block –B, Sector 15, Central Business District (CBD), Belapur, New Mumbai 400614, India
    www.aegis.edu.in 

     

    Know about the game changer: the International Business Machines Corporation (commonly referred to as IBM). What is this 100-year-old giant doing now?

    You, and even your parents, might have known IBM for computers, cash registers and the like, but that was the IBM of old. What is the IBM of today and the IBM of the future? Why is it called the game changer in the world of analytics?

    Big Data and Analytics is very much at the center of this transformation and serves as the “silver lining” of IBM’s core business initiatives today. To meet growing client demands in today’s data intensive industries, IBM has established the world’s deepest and broadest portfolio of Big Data and Analytics technologies and solutions, spanning services, software, research and hardware.

    89% of business leaders believe Big Data will revolutionize business operations in the same way the Internet did. - Forbes

    Agenda:

    10:00 AM to 10:15 AM | Key Note | Dr. Abhijit Gangopadhyay, Dean, Aegis School of Business, Data Science and Telecom; Founding Dean, IIM Indore
    10:15 AM to 11:00 AM | Data Science – What, Why and Where it is going | Mr. Bhupesh Daheria, CEO, Aegis School of Business, Data Science and Telecom
    11:00 AM to 11:45 AM | Rise of the Emerging Technologies & impact on learning | Ms. Sheetal Soni, Country Channel Manager - Career Education, IBM
    11:45 AM to 12:15 PM | Predictive Analytics using SPSS Products | Mr. Hrishikesh Pathak, Data Scientist, IBM India
    12:15 PM to 12:30 PM | Demonstration of SPSS Modeler | Mr. Hrishikesh Pathak, Data Scientist, IBM India
    12:30 PM to 1:00 PM | IBM InfoSphere BigInsights Demo | Mr. Suhas Pote, Subject Matter Expert, Aegis School of Business, Data Science and Telecom
    1:00 PM to 2:00 PM | Networking Lunch
    2:00 PM to 2:15 PM | Machine Learning for Data Science | Ms. Minta Thomas, Head Analytics, Aegis School of Business, Data Science and Telecom
    2:15 PM to 2:30 PM | How Aegis can help you launch your career in Data Science and Analytics | Mr. Ritin Joshi, Head Admission, Aegis School of Business, Data Science and Telecom
    2:30 PM to 3:00 PM | Softcomputing in new roadway to Intelligence System | Mr. Avinash Kumar Singh, Project Manager, eClerx
    3:00 PM to 4:00 PM | IBM Case Studies | Mr. Tushar Kale, Big Data Evangelist, IBM
    4:00 PM to 4:15 PM | Tea Break
    4:15 PM to 5:00 PM | Discussion on Cognitive Analytics & Demonstration of Watson Analytics | Mr. Hrishikesh Pathak, Data Scientist, IBM India

      Date: Saturday, April 2, 2016, 10:00 AM to 5:00 PM

      Venue: Aegis School of Business, Seminar Hall No. 315, CETTM, MTNL, near Hiranandani Gardens, Powai, Mumbai (map)

      Outside participants can also watch online on the mUniversity portal. Once you register, you can access the live Meetup link inside the dashboard.

      For registration, apply via the button at the top.

      For any queries reach us on: 8828084908

      For any difficulty in accessing the session, please get in touch with Purvesh Mhatre at purvesh.m@aegis.edu.in or call +91 8767662874

      Aegis School of Data Science PGP in Big Data Analytics

      Application Development and Deployment for Cloud using IBM BlueMix

      Cloud computing - the deployment of network-based applications in a highly flexible, shared IT environment - is becoming a key enabler of better service delivery and greater value in today's business landscape. It offers a number of major advantages over more traditional application deployment models, including the more efficient use of IT and development resources, easier and less costly maintenance, and the ability to deliver consistent services through a variety of channels. Cloud computing also makes it easier for businesses to partner and bring enhanced composite service offerings to market very quickly. But developing cloud-based applications requires new approaches that address the unique requirements of software-as-a-service. IBM Application Development Services for Cloud delivers on the promise of cloud application development by building custom cloud applications from implementation planning to design, development and deployment.

      About the course
      IBM BlueMix is IBM's Platform-as-a-Service (PaaS), built to help developers quickly integrate applications and speed the deployment of new cloud services. Built on open standards and leveraging Cloud Foundry, IBM BlueMix has made major advancements since being announced in 2014, including:

      – Cloud integration services to enable a secure connection between an organization's public applications and private data

      – Swift, secure and fast connections via the cloud between Internet of Things and machine-to-machine devices, to store, query and visualize data

      – Data and analytics-as-a-service to allow the rapid design and scale of applications that turn Big Data into competitive intelligence

      – DevOps services that support the complete developer lifecycle

      IBM BlueMix combines the strength of IBM's middleware software with third-party and open technologies to create an integrated development experience in the cloud. Using an open cloud environment, IBM BlueMix helps both born-on-the-web and enterprise developers build applications with their choice of tools. With the ability to easily combine a variety of services and APIs from across the industry to compose, test and scale custom applications, developers will be able to cut deployment time from months to minutes. This course is designed to teach you how to build applications for deployment to cloud platforms using BlueMix.

      Pre-Requisites 
      To benefit from this course, participants should have the following skills or experience:

      – Programming using Java
      – Familiarity with Java EE for Web development (HTML, JSPs, and Servlets)
      – Experience using relational databases
      – The design-implement-test-deploy stages of application development 
      – Object-oriented design and analysis

      Course Contents 
      This course covers the following topics:

      Introduction to Bluemix

      • Cloud computing overview
      • Consumption view: IaaS (SoftLayer), PaaS (IBM Bluemix)
      • Bluemix architecture
      • Bluemix overview and dashboard
      • Setup and installations: Eclipse and CF plugins
      • Lab Exercise: Building an application from a boilerplate in the Bluemix UI
      • Deploying a Java web app that uses the PostgreSQL service with the IBM Bluemix Eclipse tools
      • Lab Exercise: Building and deploying the Java version with the IBM Bluemix Eclipse tools

      Development of Apps using Bluemix Services

      • Registering services in Bluemix
      • Lab Exercise: Deploying a Node.js app that uses the MySQL service with command line tools
      • Lab Exercise: Build a Twitter influencer application in Bluemix with the Bluemix Eclipse tools

      Development of Apps using DevOps Services on Bluemix

      • Overview of DevOps
      • Overview of Bluemix DevOps Services
      • Lab Exercise: DevOps
      • Lab Exercise: JEE Cloud Trader benchmark application on Bluemix, using performance analysis capabilities

      Bluemix Services in Mobility & Big Data

      • Overview of services in the areas of mobile app development & Big Data
      • Lab Exercise: Building an application with Mobile Backend as a Service (MBaaS) on the Bluemix platform
      • Building and deploying the Node.js version with the IBM Bluemix Eclipse tools
      • Deploying the Python version with command line tools
      • Part 1: Importing and deploying the application from
      • Part 2: (Optional) Updating the application
      • Lab Exercise: Data Management service: build a BI application using the MapReduce service to perform analytics on Big Data sets
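
      As a minimal sketch of the kind of app pushed to a Cloud Foundry-based PaaS such as Bluemix, here is a hello-world Python web app; the framework choice (Flask) is our own assumption, not part of the course.

```python
# app.py -- minimal web app; Cloud Foundry platforms inject the listening
# port through the PORT environment variable.
import os
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello from the cloud!"

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8000)))
```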

      Roadmap 
      Participants who wish to learn further about Cloud Computing should take the following courses:

      - Advanced Application Development and Deployment for Cloud using IBM Bluemix
      - Cloud Security

      For Admissions:

      Please send your Resume to admission@aegis.edu.in 
      Get in Touch with Ritin Joshi  at 9022137010

      Fundamentals of Data Science

      There is much debate among scholars and practitioners about what data science is, and what it isn’t. Does it deal only with big data? What constitutes big data? Is data science really that new? How is it different from statistics and analytics?

      One way to consider data science is as an evolutionary step in interdisciplinary fields like business analysis that incorporate computer science, modeling, statistics, analytics, and mathematics. Data science is the study of the generalizable extraction of knowledge from data, yet the key word is science. It incorporates varying elements and builds on techniques and theories from many fields, including signal processing, mathematics, probability models, machine learning, statistical learning, computer programming, data engineering, pattern recognition and learning, visualization, uncertainty modeling, data warehousing, and high performance computing with the goal of extracting meaning from data and creating data products.

      From government, social networks and ecommerce sites to sensors, smart meters and mobile networks, data is being collected at an unprecedented speed and scale. Data science can put big data to use.

      [Infographic: average number of "likes" and comments posted on Facebook daily; percentage of the world's data produced in the last two years; projected volume of e-commerce transactions in 2016.]

      Data Science is not restricted to only big data, although the fact that data is scaling up makes big data an important aspect of data science.

      What is Hadoop

      Apache™ Hadoop® is an open source software project that enables distributed processing of large data sets across clusters of commodity servers. It is designed to scale up from a single server to thousands of machines with a very high degree of fault tolerance. Rather than relying on high-end hardware, the resiliency of these clusters comes from the software's ability to detect and handle failures at the application layer.

      Hadoop is a data storage and processing system that enables data storage, file sharing and data analytics. The technology is scalable and enables effective analysis of large volumes of unstructured data, thereby adding value. It lets its users handle more data through enhanced storage capacity, and also enables data retrieval in the event of hardware failure.

      Hadoop increasingly offers data retrieval and data security features, and these features are improving with time, leading to enhanced solutions for database management systems (DBMS). Hadoop software is the fastest growing segment of the Hadoop market compared with hardware and services.

      Who is Using Hadoop

      With the increasing role of social media and internet communication, Hadoop is used by a wide spectrum of companies, ranging from social networks and e-commerce players to entertainment and enterprise software firms, including:

      • Yahoo (one of the biggest users, and contributor of more than 80% of the Hadoop code)
      • Facebook
      • Cloudera
      • Hortonworks
      • IBM
      • Intel
      • Mapr
      • Microsoft
      • Netflix 
      • Amazon
      • Adobe 
      • Ebay
      • Hulu
      • Twitter
      • Snapdeal
      • TataSky

      Why use Hadoop?

      Hadoop changes the economics and the dynamics of large-scale computing. Its impact can be boiled down to four salient characteristics. Hadoop enables a computing solution that is:

      Scalable

       A cluster can be expanded by adding new servers or resources without having to move, reformat, or change the dependent analytic workflows or applications.

      Cost effective

       Hadoop brings massively parallel computing to commodity servers. The result is a sizeable decrease in the cost per terabyte of storage, which in turn makes it affordable to model all your data.

      Flexible

      Hadoop is schema-less and can absorb any type of data, structured or not, from any number of sources. Data from multiple sources can be joined and aggregated in arbitrary ways enabling deeper analyses than any one system can provide.

      Fault tolerant

       When you lose a node, the system redirects work to another location of the data and continues processing without missing a beat.

      Hadoop architecture

      Hadoop is composed of four core components—Hadoop Common, Hadoop Distributed File System (HDFS), MapReduce and YARN.
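
      To make the MapReduce component concrete, here is a minimal word-count sketch using Hadoop Streaming with Python. The script names, input/output paths and the exact location of the streaming jar are illustrative assumptions and vary by distribution:

      # mapper.py - reads text from stdin and emits one "word<TAB>1" pair per word
      import sys

      for line in sys.stdin:
          for word in line.split():
              print(word + "\t1")

      # reducer.py - Hadoop sorts mapper output by key, so equal words arrive together
      import sys

      current_word, count = None, 0
      for line in sys.stdin:
          word, value = line.rstrip("\n").split("\t", 1)
          if word != current_word and current_word is not None:
              print(current_word + "\t" + str(count))
              count = 0
          current_word = word
          count += int(value)
      if current_word is not None:
          print(current_word + "\t" + str(count))

      A job is then launched along the lines of: hadoop jar hadoop-streaming.jar -input /data/in -output /data/out -mapper mapper.py -reducer reducer.py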

      Hadoop for the Enterprise from IBM

      IBM BigInsights brings the power of Hadoop to the enterprise, enhancing Hadoop with administrative, discovery, development, provisioning, security and support capabilities, along with best-in-class analytics. IBM® BigInsights™ for Apache™ Hadoop® is a complete solution for large-scale analytics. Explore Big Data Analytics using IBM InfoSphere BigInsights, taught by IBM experts at Aegis.

      Sample Jobs in Hadoop

      Companies Recruiting

      • IBM
      • Myntra
      • Snapdeal
      • HP
      • EMC
      • Cloudera

      Hadoop Course Overview

      Through lectures, hands-on exercises, case studies, and projects, students will explore the Hadoop ecosystem, learning topics such as:

      • What Hadoop is and the real-world problems it solves
      • MapReduce concepts and how MapReduce works
      • Writing MapReduce programs
      • Architecting Hadoop-based applications
      • Hadoop operations

      Prerequisites and Requirements

      • Programming proficiency in Java or Python is required. Prior knowledge of Apache Hadoop is not required.
      • Primary language during lectures will be Java. However, assignments and projects completed in both Java and Python will be accepted.

      Note: For students who do not have a programming background in Java or Python, additional readings or learning videos will be prescribed. The programming prerequisites will need to be completed within the first 2 weeks of the course.

      Course Contents

      1. Introduction to Hadoop: Real-World Hadoop Applications and Use Cases, Hadoop Ecosystem & projects, Types of Hadoop Processing, Hadoop Distributions, Hadoop Installation.
      2. Hadoop MapReduce: Developing a MapReduce Application, MapReduce Internals, MapReduce I/O, Examples illustrating MapReduce Features.
      3. Hadoop Application Architectures: Data Modeling, Data Ingestion, and Data Processing in Hadoop.
      4. Hadoop Projects: Practical Tips for Hadoop Projects.
      5. Hadoop Operations: Planning a Hadoop Cluster, Installation and Configuration, Security, and Monitoring.
      6. Hadoop Case Studies: A series of Group Discussions and Student Project Presentations.

      References

      1. Lecture notes will form the primary reference material for the course.
      2. Specific reading material, papers, and videos will be assigned during the course to augment the lecture notes.
      3. Reference Textbook: Hadoop: The Definitive Guide, 4th Edition, Tom White, O’Reilly Media, Inc.

      Evaluation

      1. Homework Assignments: Students will be assigned 5 homework assignments during the course: 10%
      2. Quizzes: The best 2 out of 3 unannounced quizzes will be counted towards the final grade: 10%
      3. Mid-Term Exam: 20%
      4. Final Exam: 40%
      5. Student Project: The students will individually, or optionally in teams of two, research, design, implement, and present a substantial term project: 20%

        Consult our Big Data Career Advisers at +91 704 531 4371 / +91 981 900 8153 on how to add wings to your career.

      Course description:

      Welcome to IBM InfoSphere BigInsights, a platform for the analysis and visualization of Internet-scale data volumes that is powered by Apache Hadoop, an open source distributed computing platform.

      It is a hands-on training course for those who want a foundation in IBM InfoSphere BigInsights. It provides an overview of IBM's Big Data strategy along with more detailed and descriptive information on Hadoop technology. The course also presents the concepts a system administrator needs to work with the Hadoop Distributed File System, and the MapReduce concepts a developer needs. It gives learners an introduction to the scheduling capabilities of Hadoop.

      IBM BigInsights delivers a rich set of advanced analytics capabilities that allows enterprises to analyze massive volumes of structured and unstructured data in its native format. The software combines open source Apache Hadoop with IBM innovations including sophisticated text analytics, IBM BigSheets for data exploration, IBM Big SQL for SQL access to data in Hadoop and a range of performance, security and administrative features. The result is a cost-effective and user-friendly solution for complex, big data analytics.

      Attendees will learn how to develop solutions on IBM BigInsights using the various analytical capabilities of the platform.

      The course will give you an introduction to these technologies and will help you pick the right tool for the task at hand.

      Course outline:

      • Introduction to Big Data
      • Introduction to InfoSphere BigInsights
      • Apache Hadoop and HDFS overview
      • Exercise - Exploring Apache Hadoop
      • GPFS-FPO (General Parallel File System - File Placement Optimizer)
      • BigInsights web console security
      • Introduction to MapReduce
      • Project

      About Technology

      IBM Cognos Business Intelligence turns data into past, present and future views of your organization's operations and performance so your decision makers can capitalize on opportunities and minimize risks. You can use these views to understand the immediate and downstream effects of decisions that span potentially complex interrelated factors. Consistent snapshots of business performance are provided in enterprise-class reports and independently assembled dashboards based on trusted information. As a result, non-technical and technical business intelligence (BI) users and IT alike can respond quickly to rapidly changing business needs.

      About Course 
      Descriptive analytics helps an organization drive better decisions, achieve better results, and gain a deeper understanding of trends, opportunities, weaknesses and threats. IBM Cognos BI lets you explore data, in any combination and over any time period, with a broad range of analytics capabilities. With the ability to interact, search and assemble all perspectives of a business, IBM Cognos BI provides a limitless BI workspace. This course takes a business perspective, using IBM Cognos BI as the tool to produce the report outputs.

      IBM Business Analytics lab

      'By 2018, there will be a shortage of 1.5 million analysts/managers who can make data-driven decisions', - McKinsey Global Institute's report on Big Data

      Analytics firms in India will soon face a shortage of 2 lakh data scientists - The Hindu

      Pre-requisites:

      • Experience using 
         the Windows operating system
         a web browser
      • Basic understanding of RDBMS

      Course Contents

      • Overview of IBM COGNOS 10.2 BI 
      • Identify Common Data Structure
      • Gather Requirements
      • Creating a Baseline Project
      • Introduction to the Reporting Application
      • Create List Reports
      • Focus Reports using Filters
      • Create Crosstab Reports
      • Present Data Graphically
      • Focus Reports using Prompts
      • Extend Reports using Calculations
      • Customize Reports with Conditional Formatting
      • Drill Through From One Report to Another
      • Create a Report using Relational Data

      For Admissions:

      Please send your Resume to admission@aegis.edu.in 
      Get in Touch with Ritin Joshi and Rupali Dutta at 9022137010

      Fundamentals of Enterprise Performance Management using IBM Cognos TM1

      About Course
      This course will guide modelers in building a complete model in TM1. IBM Cognos® TM1® is an enterprise planning software platform that can transform your entire planning cycle, from target setting and budgeting to reporting, scorecarding, analysis and forecasting. Cognos TM1 enables you to collaborate on plans, budgets and forecasts.

      You can analyze data, and create models – including profitability models – to reflect a constantly evolving business environment. In addition, integrated scorecards and strategy management capabilities help you monitor performance metrics and align resources and initiatives with corporate objectives and market events. Through a series of lectures and hands-on exercises, students will learn how to model, link and deploy TM1 applications.

      Pre-requisites
      Prior to attending this course, students need:

      • Basic knowledge of OLAP
      • Significant experience with Excel spreadsheets (functions, macros, etc.)
      • An understanding of the metrics and drivers of your business

      IBM Business Analytics lab

      Contents

      1. Introduction to Cognos TM1

      2. Understanding Data Dimensions & Cubes

      3. Creation of Data Dimensions, Data Cubes, Views, etc.

      4. Loading & Maintaining Data

      5. Adding Business Rules

      6. Optimize Rule Performance

      7. Transfer of Data to the Business Models

      8. Customizing Drill Paths

      9. Using Rules for Advanced Modeling

      10. Convert Currencies

      11. Model for Different Fiscal Requirements

      12. Introduction to Managed Planning Applications

      13. Contribute Data to Managed Planning Applications

      14. Creation & Deployment of Managed Planning Applications on the Web

      15. Defining Workflow in an Enterprise

      16. Integration of Cognos TM1 & Cognos BI

      17. Design for Reporting

      18. Optimize and Tune TM1 Models


      PGP_Feb 2016_Career Management Centre

      PGP_Sep 2015_Career Management Centre


      This program gives participants insight into business applications and use cases in different functional areas for government and for various industries such as telecom, retail, ecommerce, automobile, hospitality, airline, healthcare, and banking & finance.

      Use cases of Big Data and Data Science


      Python is a powerful, flexible, open-source language that is easy to learn, easy to use, and has powerful libraries for data manipulation and analysis. It is among the top three languages for analytics applications, along with R and SAS. Having originated as an open source scripting language, Python usage has grown over time. Today, it sports libraries (numpy, scipy and matplotlib) and functions for almost any statistical operation or model building you may want to do. Since the introduction of pandas, it has become very strong in operations on structured data. Its simple syntax is very accessible to programming novices, and will look familiar to anyone with experience in Matlab, C/C++, Java, or Visual Basic. Python has a unique combination of being both a capable general-purpose programming language and being easy to use for analytical and quantitative computing.
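
      To make the structured-data point concrete, here is a minimal pandas sketch; the data is invented purely for illustration:

      import pandas as pd

      # toy structured dataset: compute the average score per group
      df = pd.DataFrame({
          "group": ["a", "a", "b", "b"],
          "score": [1.0, 2.0, 3.0, 4.0],
      })
      print(df.groupby("group")["score"].mean())   # a -> 1.5, b -> 3.5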

      For over a decade, Python has been used in scientific computing and highly quantitative domains such as finance, oil and gas, physics, and signal processing. It has been used to improve Space Shuttle mission design, process images from the Hubble Space Telescope, and was instrumental in orchestrating the physics experiments which led to the discovery of the Higgs Boson (the so-called "God particle"). At the same time, Python has been used to build massively scalable web applications like YouTube, and has powered much of Google's internal infrastructure. Python is easy for analysts to learn and use, but powerful enough to tackle even the most difficult problems in virtually any domain. It integrates well with existing IT infrastructure, and is very platform independent. Among modern languages, the agility and productivity of Python-based solutions are legendary. Companies of all sizes and in all areas - from the biggest investment banks to the smallest social/mobile web app startups - are using Python to run their business and manage their data.

      This course covers the Python programming language and standard library, with a focus on applying Python to problems in scripting, data analysis, and systems programming. It assumes no prior experience with Python, but does assume participants know how to program in another programming language. It is especially well suited to programmers who want to know what Python is all about without extra fluff. A short worked example follows the content list below.

      Course Content:

      • Strings and Lists
      • Loops
      • Functions
      • Tuples, Dictionaries and Files
      • Regular Expressions
      • NLTK
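
      As referenced above, a short worked example tying several of these topics together (strings, files, dictionaries via Counter, and regular expressions); the input file name is a made-up placeholder:

      import re
      from collections import Counter

      # count the five most common words in a text file (file name is hypothetical)
      with open("sample.txt") as f:
          words = re.findall(r"[a-z']+", f.read().lower())

      for word, count in Counter(words).most_common(5):
          print(word, count)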

      Statistics and Probability for Data Analysis

      Decision making under uncertainty is largely based on applying statistical data analysis for probabilistic risk assessment of your decision. Managers need to understand variation for two key reasons: first, so that they can lead others to apply statistical thinking in day-to-day activities, and second, to apply the concept for the purpose of continuous improvement. This course will provide you with hands-on experience to promote the use of statistical thinking and techniques, and to apply them to make educated decisions whenever there is variation in business data.

      This course exposes you to elementary statistics and probability concepts that provide the necessary foundation for application in data analysis. A short worked example follows the topic lists below.

      Statistics

      • Dataset
      • Descriptive Statistics
      • Percentiles
      • Inferential Statistics
      • Scaling
      • Normalization
      • Transformation
      • Outlier Detection/Fixing
      • Sparseness Elimination
      • Sampling
      • Exploratory Data Analysis
      • ANOVA
      • Correlation Analysis
      • Feature Selection
      • Uncorrelated Feature Elimination

      Probability

      • Probability Theory
      • Distributions
      • Hypothesis Testing
      • Significance (p-) Value
      • Maximum Likelihood
      • Monte Carlo
      • Regression Analysis
      • Time Series
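
      As the worked example mentioned above, here is a hedged sketch of a two-sample t-test (hypothesis testing) on simulated data; the means, spreads and significance threshold are arbitrary choices for illustration:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      a = rng.normal(loc=10.0, scale=2.0, size=50)   # e.g., daily sales under process A
      b = rng.normal(loc=11.0, scale=2.0, size=50)   # e.g., daily sales under process B

      t_stat, p_value = stats.ttest_ind(a, b)        # two-sample t-test
      print("t =", round(t_stat, 2), "p =", round(p_value, 4))
      # reject the null hypothesis of equal means when p falls below the chosen
      # significance level (commonly 0.05)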

      Machine Learning and Data Mining

      This course offers a thorough grounding in machine learning concepts as well as practical advice on applying machine learning tools and techniques in real-world data mining situations. It will teach you everything you need to know about preparing inputs, interpreting outputs, evaluating results, and the algorithmic methods at the heart of successful data mining. A short scikit-learn sketch follows the syllabus below.

      Course Content/Syllabus of Machine Learning and Data Mining

      • Introduction
        • Machine Learning Basics
        • Data Mining Basics
        • Supervised and Unsupervised Learning
      • Regression Analysis
        • Linear Regression
        • Logistic Regression
      • Regularization
        • Bias
        • Variance
        • Over-fitting
      • Classifiers
        • Decision Trees & Random Forest
        • k-Nearest Neighbor
        • Support Vector Machines
      • Clustering
        • K-Means Clustering
      • Applying Machine Learning
        • Model Selection
        • Train/Test/Validation sets
        • Learning curves
      • Machine Learning Model Design
        • Error Analysis
        • Error Metric
        • Precision and Recall
      • Dimensionality Reduction
        • Principal Component Analysis
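
      As the sketch referenced above, here is a minimal scikit-learn example covering the train/test split, a logistic regression classifier, and the precision and recall metrics from this syllabus; the bundled breast-cancer dataset is chosen purely for illustration:

      from sklearn.datasets import load_breast_cancer
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import precision_score, recall_score
      from sklearn.model_selection import train_test_split

      # load a small built-in dataset and hold out 30% of it as a test set
      X, y = load_breast_cancer(return_X_y=True)
      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.3, random_state=42)

      model = LogisticRegression(max_iter=5000)   # a generous max_iter aids convergence
      model.fit(X_train, y_train)
      pred = model.predict(X_test)

      print("precision:", precision_score(y_test, pred))
      print("recall:   ", recall_score(y_test, pred))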

      R for Business Analytics

      During the last decade, the momentum coming from both academia and industry has lifted the R programming language to become the single most important tool for computational statistics, visualization and data science. Worldwide, millions of statisticians and data scientists use R to solve their most challenging problems in fields ranging from computational biology to quantitative marketing. R has become the most popular language for data science and an essential tool for Finance and analytics-driven companies such as Google, Facebook, and LinkedIn.

      Every data analysis technique at your fingertips

      R includes virtually every data manipulation, statistical model, and chart that the modern data scientist could ever need. You can easily find, download and use cutting-edge community-reviewed methods in statistics and predictive modeling from leading researchers in data science, free of charge.

      Create beautiful and unique data visualizations

      Representing complex data with charts and graphs is an essential part of the data analysis process, and R goes far beyond the traditional bar chart and line plot. Heavily influenced by thought leaders in data visualization like Bill Cleveland and Edward Tufte, R makes it easy to draw meaning from multidimensional data with multi-panel charts, 3-D surfaces and more. The custom charting capabilities of R are featured in many of the stunning infographics seen in the New York Times, The Economist, and the FlowingData blog.

      Get better results faster

      Instead of using point-and-click menus or inflexible "black-box" procedures, R is a programming language designed expressly for data analysis. Intermediate level R programmers create data analyses faster than users of legacy statistical software, with the flexibility to mix-and-match models for the best results. And R scripts are easily automated, promoting both reproducible research and production deployments.

      Draw on the talents of data scientists worldwide

      As a thriving open-source project, R is supported by a community of more than 2 million users and thousands of developers worldwide. Whether you're using R to optimize portfolios, analyze genomic sequences, or to predict component failure times, experts in every domain have made resources, applications and code available for free, online.

      Course

      This course introduces you to the R statistical processing language, including how to install R on your computer, read data from SPSS and spreadsheets, and use packages for advanced R functions.

      The course continues with examples of how to create charts and plots, check statistical assumptions and the reliability of your data, look for data outliers, and use other data analysis tools. Finally, you will learn how to get charts and tables out of R and share your results via presentations and web pages.

      Course Content

      • What is R?
      • Installing R
      • Data Types
      • Import/Export Data
      • Operators
      • Built-in Functions
      • Control Structures
      • User-defined Functions
      • Data Management
      • Debugging

      Post Graduate Program (PGP) in Data Science, Business Analytics & Big Data in association with IBM
      A True Holistic Data Science Program


      "There were 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created in every 2 days" 
      - Eric Schmidt, Executive Chairman, Google

      Every day, we create 2.5 quintillion bytes of data — so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few. This is big data, and it consists mainly of unstructured data. Big data poses challenges in terms of storage and further analysis, both at rest and in real time. There is a gold mine to dig if we can make sense of big data.

      In the global and highly interconnected world we live in, the greatest power is the right information. Data Science, Big Data and Business Analytics are thus primary tools for any organisation, society or government seeking competitive advantage and optimisation of the existing ecosystem. A career in Business Analytics, Big Data and Data Science promises challenge, opportunity, and reward.

      Modern Data Scientists are Supermen with a Crystal Ball

      Data Scientists are supermen who tell stories out of data, irrespective of its size: big or small. They answer questions like: What is happening? What will happen? What should we do now? These are commonly termed BI, predictive analytics and prescriptive analytics. Data Scientists use statistics, machine learning, NLP, R, Python, Hadoop and Spark to build a crystal ball that predicts the future, solves business problems or uncovers missed opportunities. Snapdeal, which has over 20 million subscribers and generates terabytes of data through its customer interactions, in addition to a catalogue of over 5 million items, churns 15 million data points (related data sets, such as a consumer shopping on specific days for a particular item) within two hours using Hadoop. About 35% of its orders come from recommendation and personalization systems, and the conversion rate of such orders is 20-30% higher than normal orders, the company claims.

      Post Graduate Program in Data Science, Business Analytics and Big Data  (PGP-DS-BA-BigData)

      The Post Graduate Program in Data Science, Business Analytics and Big Data (PGP-DS-BA-BigData) is India's first high end data science program, designed and delivered by Aegis School of Business, Data Science & Telecommunication in association with IBM to train the new generation of data-savvy professionals. This 11-month program provides intensive hands-on training to develop the necessary and unique set of skills required for a successful career in the fastest growing and intellectually stimulating fields of Data Science, Big Data, Business Analytics, Predictive Analytics, NLP, ML and Cognitive Computing. Students who earn the PGP develop deep quantitative capabilities and technical expertise, and are equipped to create business and social value by extracting useful insights and applying them across various industries in the role of Data Scientist, Big Data Adviser or modern Business Analyst.

      This program is delivered at the Centre for Excellence in Telecom Technology and Management (CETTM), MTNL's world class campus, with a state-of-the-art "IBM Business Analytics Lab" and "IBM Cloud Computing Lab", at Powai in Mumbai.

       

      Sheetal Soni, Country Channel Manager, IBM India speaking on PGP in Business Analytics and Big Data in association with Aegis

      Aegis School of Business, Data Science & Telecommunication 
      Aegis is a leading school in the field of telecom management and technology management, started in 2002 with the support of Bharti Airtel to develop future telecom leaders. Aegis offers various programs in 25 countries to top executives from leading IT/telecom firms. In 2015 Aegis joined hands with IBM to offer high end courses in the fields of Business Analytics, Big Data, Cloud Computing and Mobility. Learn more about Aegis at www.aegis.edu.in and www.bellaward.com

      IBM

      To meet growing client demands in today's data intensive industries, IBM has established the world's deepest and broadest portfolio of Big Data and Analytics technologies and solutions, spanning services, software, research and hardware. Fueling this is the fact that IBM has made a multi-billion dollar investment in big data and analytics — $24 billion since 2005 through both acquisitions and R&D — to extend its leadership position in this strategic market. IBM has been named a leader in the IDC MarketScape for its Business Analytics practice.

      Why PGP/MS in Data Science, Business Analytics and Big Data program? 

      Aegis is a leader in Data Science, Business Analytics and Big Data. Read what makes the Aegis PGP in Business Analytics & Big Data, the first data science program in India offered in association with IBM, the best program in India.

      Highlights of the Program

      PGP program Certification from IBM

      Certification from IBM, a world leader in Big Data Analytics, at the completion of the program.

      Best Data Science curriculum

      The program curriculum is designed according to the Data Scientist's skills and competencies framework with the help of leading Data Scientists. The program has a wide range of core and elective courses which no other institute around the world offers. Check the list of courses at the bottom.

      Ranked Among the Top Ten Business Analytics Courses in India

      Ranked among the top ten Business Analytics programs in India by Analytics Magazine. Check the ranking.

      A True Data Science Program, NOT an MBA/PGDBM in Business Analytics

      This program is a core technology MS-level Data Science program and NOT an inferior cousin of MBA/PGDBM programs in Business Analytics. It lays the foundation for a serious developer or Data Scientist career, or for Big Data advisory and consulting roles.

      IBM Business Analytics Lab

      IBM has set up an IBM Business Analytics Lab and an IBM Cloud Computing Lab on the campus. The program is delivered by IBM subject matter experts and the best Data Scientists from around the world.

      Big Data Product Factory

      Live projects: work on live projects such as churn prediction and call drop analysis for leading telcos, NLP projects, consumer analytics, sentiment analysis, etc. at the Aegis Big Data Product Factory as part of the program.

      Exposure to wide range of Tech, Tools and S/W

      Hands-on exposure to IBM DB2, IBM Cognos TM1, IBM Cognos Insight, IBM InfoSphere BigInsights, IBM Worklight, IBM Bluemix, R, Python, SAS, Hadoop, MapReduce, Spark, ElasticSearch, Tableau, EMR, AWS, etc.

      World class infrastructure

      On-campus classes are held at Aegis's world class infrastructure at CETTM, MTNL, Powai, Mumbai.

      CMC, Internship and Placement

      The Career Management Center (CMC) at Aegis facilitates excellent paid internships of 2 to 3 months for all students with various companies, which generally lead to final placement in roles such as Data Scientist, Manager - Data Science, Business Analyst or Risk Analyst with companies like IBM, Persistent, HDFC, Loginext, Angel Broking, CreditVidya, L&T Infotech, Virtusa, Ixsight, Pentation, Clover Infotech, L&T Finance, Guppy Media etc. This year's placement record for the PGP in Data Science, Business Analytics and Big Data:

      Fresher lowest package: 8.5 lacs
      Fresher highest package: 11 lacs
      Experienced candidates received over 100% hikes on their last package. *For experience up to 7 to 8 years.
      90% got a Data Scientist role
      Paid internship for everyone for 2 to 3 months
      *Aegis does not provide any kind of job guarantee

      Course Modules taught by IBM

      IBM faculty will teach three courses: (i) Business Intelligence using Cognos BI, (ii) Big Data Analytics using IBM InfoSphere BigInsights and (iii) Enterprise Performance Management using IBM Cognos TM1.

      Scholarships

      Dean's Scholarship for Women in Technology Leadership; Aegis Graham Bell Award Scholarship for Big Data Products; Data Science Scholar. Study loans from HDFC Credila.

      Globally acceptable Credit Structure

      This program follows the globally accepted 45 Credit Unit structure for a master's degree; in North America any master's degree is a minimum of 36 Credit Units. The program is spread over 9 months plus a 2-3 month internship or consulting assignment.

      Skill for various industries & Functional areas

      The curriculum caters to the skill requirements in various industries like Ecommerce, Telecom, Banks, Computer Services, Education, Healthcare, Insurance, Manufacturing, Retail and other industries. 

      Diverse student profile

      Aegis participants come from around the world, with experience ranging from 2 to 25 years and from diverse industries and backgrounds.


      Center of Innovation

      Through Aegis's initiative of the Aegis Graham Bell Awards for innovation in TIME (Telecom, Internet, Media, Edutainment) and SMAC (Social, Mobile, Analytics and Cloud), Aegis students get to know the best innovations and their innovators. Hon'ble Dr. Jitender Singh, Minister of State in the Prime Minister's Office, Department of Atomic Energy and Department of Space, honored the winners of the Aegis Graham Bell Awards 2015 for their breakthrough innovations. Check the winners under the Analytics and Big Data categories.

      Delivered by the Best Data Scientists

      This program is delivered by the best Data Scientists from around the world, engaged in real-life Data Science and Big Data Analytics projects. Check profiles....

      World Class LMS on Cloud

      Aegis uses a world class Learning Management System (LMS) on the cloud called mUniversity (mUni). All lectures, extensive study material, ebooks, business cases and video lectures are available on the LMS. Check a sample session by Mr. Srikanth Velamakanni, CEO, Fractal Analytics at mUni......

      Big Data Day leadership Series

      The Aegis Big Data Day Meetup provides an opportunity to meet Data Science, Big Data and Analytics leaders. Register for the latest Meetup.

            

       Download Aegis Business Analytics and Big Data Brochure

      Is Business Analytics, Big Data and Data Science for me?

      If you enjoy working with data in different ways, exploring the stories that data may reveal, and even experimenting with visualization techniques, you may enjoy a role in data science. Whether you have an academic background in business, economics, technology, statistics, mathematics, or engineering, you may be looking for a career change that will prepare you to generate high impact, or for a challenging role that allows you to integrate the value of data into all business functions. Data analysis is necessary in almost every field and industry, and Business Analysts, Big Data professionals and Data Scientists are expected to play a role in guiding strategic decisions. They are in a unique position to assist leaders in determining the right questions to ask, as well as in interpreting the answers that data provides, to help determine the organization's most effective courses of action.

      Job Opportunities 

      While there is a large and growing demand for professionals trained to collect data and use analytics to drive performance, there is currently a deficit in the supply of these individuals in the global job market. The deficit in the market for data and Big Data analytics professionals is growing exponentially and poses a serious challenge for recruiters, but it is an enviable opportunity for professionals who have the right skills and training to fill these positions.

      Program Models:

      Full Time Post Graduation Program (OnCampus)

    • Total Program: 36 Credit Units

    • Program Duration: 11 months

    • 9 months of Class room training + Live projects + One Capstone project or Data Product Development + 2 to 3 months of Internships

    • 3 terms each of 3 months

      View details and Apply


      Part Time Executive Program in Weekend model (OnCampus)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • Sat and Sun on-campus classes in Powai spread over 9 months

    • 9 months of Class room training + Live projects + One Capstone project or Data Product Development +  2-3 months industry internship/ project/ consulting assignment 

      View details and Apply


      Part Time Executive Program in Hybrid model (OnCampus + Online)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • One Block of 5 days in every 3 months plus Sat and Sun online classes spread over 11 months 

    • 9 months of Class room training + Live projects + One Capstone project or Data Product Development +  2-3 months industry internship/ project/ consulting assignment 

      View details and Apply 


      Part Time Executive Program in Online model (Online Live Interactive)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • Live interactive Classes on Saturday and Sunday

    • 9 months of Class room training + Live projects + One Capstone project or Data Product Development +  2-3 months industry internship/ project/ consulting assignment      

      View details and Apply


       Download Aegis Business Analytics and Big Data Brochure

      Eligibility Criteria for Admissions: 

      - Working executives with a bachelor's degree.

      Admission process:

      • Online application (click the Apply button at the top of this page to register and apply)
      • Personal interview
      • Admission offer (if selected)

      Get in Touch with Aegis Big Data Career Adviser Team at  +91 704 531 4371/ +91 882 808 4908/ +91 902 213 7010
      You can also email your Resume to admission@aegis.edu.in and request a call back for consultation 

      Scholarships: 

      Dean’s Scholarship for Women in Technology Leadership

      Saluting women leaders in technology, Aegis has announced the "Dean's Scholarship for Women in Technology Leadership".

      Who should apply: 

      Aspiring women leaders. Please check the eligibility criteria above.

      What you get:

      • 100% scholarship to study the Full Time PGP in Business Analytics and Big Data Program
      • Learn and work at the Big Data Product Factory
      • Come out as a Data Scientist or Entrepreneur

      Data Science Scholar

      Who should apply:

      • PhD, MTech, MSc or MS in Computer Science, Physics, Maths/Statistics or Economics
      • Or candidates who have developed a data product

      What you get:

      • 100% scholarship to study the Full Time PGP in Business Analytics and Big Data Program
      • Learn and work at the Big Data Product Factory
      • Job offer to work as a Data Scientist at the Big Data Product Factory and Aegis (optional)

      Aegis Graham Bell Award Scholarship for Big Data Products for startups

      What you get: 

      Scholarship 
      For Executive PGP in
      Business Analytics & Big Data in association with IBM

      Workspace
      At Aegis Big Data Product Factory
      in Mumbai.

      Manpower
      Access to Highly Skilled Big Data Manpower to work on your Product/Solution.

      Mentorship
      From Business Leaders, Data Scientists & Functional Heads

      Recognition
      Get Recognized at The Largest Innovation Award in TIME & SMAC
      'Aegis Graham Bell Awards'.

      Networking
      Opportunity to network with Business Leaders, Functional Heads, Peers & VCs.

      POC
      Help with proof of concept (POC)

      Funding
      Help in fundraising and introductions to VCs

      Who should apply:

      • Startup firms
      • Individuals who have developed a data product
      • Anyone with a breakthrough idea for a Big Data product, solution or platform that they want to develop
      • Also check www.bellaward.com

      How to Apply for the Scholarship: Click the Apply button at the top and submit the application form. The admissions and scholarship application is the same.

      Aegis School Of Business, Data Science & Telecommunication
      CETTM, MTNL, Technology Street, | Hiranandani Gardens, Powai, Mumbai – 400076, India
      Mahesh Block –B, Sector 15, Central Business District (CBD), Belapur, New Mumbai 400614, India
      www.aegis.edu.in 

      Operational Analytics enables near real-time analysis of processes, helping you optimize your operations by efficiently identifying and isolating inefficiencies and failures. The data for decision making and discovery may be derived from different sources, such as your internal database, or externally from social media platforms. It can be hosted on the cloud and accessed through mobile devices. Business analytics, coupled with data mining, self-service, predictive analytics and aggregation tools, is used to obtain more transparent information for critical decision making and for implementing decision support systems.

      Operational Analytics helps you:

      • Gain visibility and deep insights into your business processes
      • Understand potential issues and take rapid action to improve performance
      • Gain faster access to opportunities and threats in the markets
      • Make quick decisions through ready access to information in near real time

      Our goal is to introduce students to a powerful class of model, the Neural Network. In fact, this is a broad term which includes many diverse models and approaches. We will first motivate networks by analogy to the brain. The analogy is loose, but serves to introduce the idea of parallel and distributed computation.

      We then introduce one kind of network in detail: the feedforward network trained by backpropagation of error. We discuss model architectures, training methods and data representation issues. We hope to cover everything you need to know to get backpropagation working for you. A range of applications and extensions to the basic model will be presented in the final section of the module.

      • Introduction
      • Classification
      • Optimizing Linear Networks
      • The Backprop Toolbox
      • Unsupervised Learning
      • Reinforcement Learning
      • Advanced Topics
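
      To ground the module list above, here is a minimal feedforward network trained by backpropagation of error on the XOR problem, written from scratch with numpy; the 2-3-1 sigmoid architecture, learning rate and iteration count are arbitrary illustrative choices:

      import numpy as np

      # XOR inputs and targets
      X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
      y = np.array([[0], [1], [1], [0]], dtype=float)

      rng = np.random.default_rng(1)
      W1, b1 = rng.normal(size=(2, 3)), np.zeros((1, 3))   # input -> hidden weights
      W2, b2 = rng.normal(size=(3, 1)), np.zeros((1, 1))   # hidden -> output weights

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      lr = 1.0
      for _ in range(10000):
          # forward pass
          h = sigmoid(X @ W1 + b1)
          out = sigmoid(h @ W2 + b2)
          # backward pass: propagate the error signal layer by layer
          d_out = (out - y) * out * (1 - out)        # squared-error gradient at output
          d_h = (d_out @ W2.T) * h * (1 - h)         # gradient at the hidden layer
          W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
          W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)

      print(out.round(2))   # should approach [0, 1, 1, 0]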

      What do web search, speech recognition, face recognition, machine translation, autonomous driving, and automatic scheduling have in common? These are all complex real-world problems, and the goal of artificial intelligence (AI) is to tackle these with rigorous mathematical tools. In this course, you will learn the foundational principles that drive these applications and practice implementing some of these systems. Specific topics include machine learning, search, game playing, Markov decision processes, constraint satisfaction, graphical models, and logic. The main goal of the course is to equip you with the tools to tackle new AI problems you might encounter in life.

      • Overview of AI
      • Statistics, Uncertainty, and Bayes networks
      • Machine Learning
      • Logic and Planning
      • Markov Decision Processes and Reinforcement Learning
      • Hidden Markov Models and Filters
      • Adversarial and Advanced Planning
      • Game playing
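
      As one small, hedged example from this list, value iteration for a toy two-state Markov decision process; every state, action, probability and reward below is invented for illustration:

      # value iteration on a toy MDP: transitions[s][a] = [(prob, next_state, reward), ...]
      transitions = {
          0: {"stay": [(1.0, 0, 0.0)], "go": [(0.8, 1, 5.0), (0.2, 0, 0.0)]},
          1: {"stay": [(1.0, 1, 1.0)], "go": [(1.0, 0, 0.0)]},
      }
      gamma = 0.9                       # discount factor
      V = {s: 0.0 for s in transitions}

      for _ in range(100):              # repeated Bellman optimality backups converge
          V = {
              s: max(
                  sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
                  for outcomes in actions.values()
              )
              for s, actions in transitions.items()
          }
      print(V)                          # approximate optimal state values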


      RHadoop

      RHadoop is a collection of R packages that allow users to manage and analyze data with Hadoop. The packages are tested (always before each release) on recent releases of the Cloudera and Hortonworks Hadoop distributions, and should have broad compatibility with open source Hadoop and MapR's distribution.

      Three packages of note — rmr2, rhdfs, and rhbase — provide most of RHadoop’s functionality:

      • rmr2: The rmr2 package supports translation of the R language into Hadoop-compliant MapReduce jobs (producing efficient, low-level MapReduce code from higher-level R code).

      • rhdfs: The rhdfs package provides an R language API for file management over HDFS stores. Using rhdfs, users can read from HDFS stores to an R data frame (matrix), and similarly write data from these R matrices back into HDFS storage.

      • rhbase: The rhbase package also provides an R language API, but its goal is database management for HBase stores rather than HDFS files.

       

      Hadoop Installation

      Career Management Centre

      Financial analytics is the creation of ad hoc analyses to answer specific business questions and forecast possible future financial scenarios. Financial analysis software can speed up the creation of reports and present the data in an executive dashboard, a graphical presentation that is easier to read and interpret than a series of spreadsheets with pivot tables.

      Financial analytics is a discipline that helps you take multiple, granular views of a company's financial data and use them to gain insight and take action. Although financial analytics has a wide reach and touches all parts of a company, it is especially useful in the context of profitability, a very important component of business success and performance management. Today's business challenges demand timely financial information that enables executives, managers, and front-line employees to make better decisions, take action, and correct problems before they affect the company's financial performance. Achieving better business performance requires more than just improving process efficiency or enhancing financial reporting. Rather, it requires improving effectiveness by leveraging advanced analytics that integrate data from across the organization and provide insight to the people who can affect business performance. Fully understanding the factors driving the business requires timely financial information combined with data from across the company value chain. Frequently, the answers do not reside solely in the company's financial reporting systems. Getting the most complete picture requires integrating information from the company's human resources, supply chain, customer relationship management, and financial management systems, and turning it into integrated and actionable insight.

      Popular financial analysis software programs include:

      • Oracle Financial Analytics - a modular component of Oracle's integrated family of business intelligence (BI) software applications. Enables insight into the general ledger, and provides visibility into performance against budget and into the way staffing costs and employee or supplier performance affect revenue and customer satisfaction.
      • SAP ERP Financial Analytics - helps organizations define financial goals, develop business plans and monitor costs and revenue during execution.
      • SAS Business Analytics - provides an integrated environment for data mining, text mining, simulation and predictive modeling (a mathematical model that predicts future outcomes) as well as descriptive modeling (a mathematical model that describes historical events and the relationships that created them).
      • IBM Cognos Finance - provides out-of-the-box data analysis capabilities for sales, supply chain procurement and workforce management functions.
      • NetSuite - provides financial dashboards, reporting and analytic functions that allow personal key performance indicators (KPIs) to be monitored in real time.

      Introduction to Data Science


      Course Coverage:

      • Introduction to Data Science
        • Data in the data science ecosystem
        • Data sets: training and testing data sets
        • Volume, Variety, Velocity and Value
        • Structured data, unstructured data, text data
        • Metadata modelling / KDM standard
        • MOF, KDD and KDDML
        • Data Mining Group and the PMML standard
        • Unified Modeling Language - metadata modelling
        • Text data and classification
        • Global standards - UNSPSC, DBpedia
        • Data science solutions using platforms: IaaS, PaaS and SaaS-based solution approaches
      • Big Data and Big Data technology, tools and platforms
        • Why Big Data?
        • Hadoop framework
        • MapReduce (illustrated in the sketch after this list)
        • YARN
        • HDFS
      • Services in the Hadoop ecosystem (HBase, Hive, ZooKeeper, Cassandra, MongoDB)
        • APIs and their integration model
      • Data Science Platform
        • Virtual infrastructure platforms and the public cloud
        • AWS Elastic MapReduce platform
        • Apache Spark, Spark SQL, Apache Storm
        • Machine learning platforms
        • API design and models for platforms
        • Data platform - MapR
      • Data Science Services Platform
        • Data set design and modelling using the UML infrastructure and MOF
        • KDM and KDDML for numeric data
        • Content/text data modelling
        • Text/content extraction
        • XML pipelines for text aggregation/transformation
        • Semantic content, annotation
        • OWL/RDF/RDF graph standards
        • Ontologies, vocabularies, linked platforms
        • UIMA and NLP for text analysis
      • Algorithms and Machine Learning
        • Classification of data - Support Vector Machines
        • Clustering of data - k-means
        • Collaborative filtering & recommendation engines / the easyrec engine
        • Business Intelligence platforms
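
      As a minimal illustration of the MapReduce pattern referenced in the syllabus above, the following pure-Python sketch runs the classic word count in a single process. Real Hadoop MapReduce distributes the map, shuffle and reduce phases across a cluster, but the dataflow is the same; the documents here are made-up examples.

        from collections import defaultdict

        def map_phase(documents):
            # Mapper: emit a (word, 1) pair for every word in every document.
            for doc in documents:
                for word in doc.lower().split():
                    yield (word, 1)

        def reduce_phase(pairs):
            # Shuffle: group counts by word; reduce: sum each group.
            grouped = defaultdict(list)
            for word, count in pairs:
                grouped[word].append(count)
            return {word: sum(counts) for word, counts in grouped.items()}

        docs = ["big data tools", "big data platforms", "data science"]
        print(reduce_phase(map_phase(docs)))
        # -> {'big': 2, 'data': 3, 'tools': 1, 'platforms': 1, 'science': 1}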

      With the emergence of mobile technology, everything around us is now connected to the Internet of Things, creating a powerful phenomenon that changes both how we work and how we live. There are nine billion Internet of Things (IoT) devices today, and the number is growing. Organizations must consider how to manage the complexities of connecting to a seemingly unlimited number of devices and how to effectively integrate IoT data with data from other sources, such as internal data stores. Business value comes to those who improve their data capabilities (integration, automation and analysis), not to those who simply connect the most devices to the network.

      This course considers the Internet of Things (IoT) as the general theme of real-world things becoming increasingly visible and actionable via Internet and Web technologies. The goal of the course is to take a top-down as well as a bottom-up approach, thereby providing students with a comprehensive understanding of the IoT: from a technical viewpoint as well as considering the societal and economic impact of the IoT.

      By looking at a variety of real-world application scenarios of the IoT and diverse implemented applications, the various understandings and requirements of IoT applications become apparent. This allows students to understand what IoT technologies are used for today, and what is required in certain scenarios. By looking at a variety of existing and developing technologies and architectural principles, students gain a better understanding of the types of technologies that are available and in use today and can be utilized to implement IoT solutions. Finally, students will be given the opportunity to apply these technologies to tackle scenarios of their choice in teams of two or three, using an experimental platform for implementing prototypes and testing them as running applications. At the end of the semester, all project teams will present their completed projects.

      This course looks at the technical and business aspects of the Internet of Things. Project groups will tackle specific application scenarios. This is not primarily an implementation-focused course, and different programming skill levels are welcome, but some familiarity with programming is expected.

      • What is the Internet of Things (IoT)?
      • Overview of IoT connectivity methods and technologies
      • Evaluation of the Internet of Things
      • Internet of Things security
      • Business use cases of IoT
      • IBM Watson IoT platform
      • Project

      Visualisation

      Visualisation, commonly known as data visualisation, is viewed by many disciplines as a modern equivalent of visual communication. It is not owned by any one field, but rather finds interpretation across many (e.g. it is viewed as a modern branch of descriptive statistics by some, but also as a grounded theory development tool by others). It involves the creation and study of the visual representation of data, meaning "information that has been abstracted in some schematic form, including attributes or variables for the units of information".

      A primary goal of data visualization is to communicate information clearly and efficiently to users via the information graphics selected, such as tables and charts. Effective visualization helps users in analyzing and reasoning about data and evidence. It makes complex data more accessible, understandable and usable. Users may have particular analytical tasks, such as making comparisons or understanding causality, and the design principle of the graphic (i.e., showing comparisons or showing causality) follows the task. Tables are generally used where users will look up a specific measure of a variable, while charts of various types are used to show patterns or relationships in the data for one or more variables.

      Data visualization is both an art and a science. The rate at which data is generated has increased, driven by an increasingly information-based economy. Data created by internet activity and an expanding number of sensors in the environment, such as satellites and traffic cameras, are referred to as "Big Data". Processing, analyzing and communicating this data present a variety of ethical and analytical challenges for data visualization. The field of data science and practitioners called data scientists have emerged to help address this challenge.

      Course Content:

      • Data Exploration in R                                 
      • Line Chart
      • Scatter Plot
      • Histogram
      • Boxplot     
      • ggplot       
      • Tree Maps
      • D3.js
      • InfoVis

        

      Certification from TSSC (Telecom Sector Skill Council), a non-profit industry driven body set up under the aegis of the NSDC (National Skill Development Council), at the completion of the program.

       

      Cisco Certified Network Associate (CCNA) Routing and Switching is a certification program for entry-level network engineers that helps maximize your investment in foundational networking knowledge and increase the value of your employer's network. CCNA Routing and Switching is for Network Specialists, Network Administrators, and Network Support Engineers with 1-3 years of experience. The CCNA Routing and Switching certification validates the ability to install, configure, operate, and troubleshoot medium-size routed and switched networks.

      Course Curriculum:

      • Introduction to Networks
        • Course Introduction 
        • Exploring the Network
        • Configuring a Network Operating System 
        • Network Protocols and Communications
        • Network Access 
        • Ethernet
        • Network Layer 
        • Transport Layer
        • IP Addressing 
        • Subnetting IP Networks
        • Application Layer 
        • It’s a Network
      • Routing and Switching Essentials
        • Course Introduction
        • Introduction to Switched Networks
        • Basic Switching Concepts and Configuration
        • VLANs
        • Routing Concepts
        • Inter-VLAN Routing
        • Static Routing
        • Routing Dynamically
        • Single-Area OSPF
        • Access Control Lists
        • DHCP
        • Network Address Translation for IPv4
      • Scaling Networks
        • Course Introduction
        • Introduction to Scaling Networks
        • LAN Redundancy
        • Link Aggregation
        • Wireless LANs
        • Adjust and Troubleshoot Single-Area OSPF
        • Multiarea OSPF
        • EIGRP
        • EIGRP Advanced Configurations and Troubleshooting
        • IOS Images and Licensing
      • Connecting Networks
        • Course Introduction
        • Hierarchical Network Design
        • Connecting to the WAN
        • Point-to-Point Connections
        • Frame Relay
        • Network Address Translation for IPv4
        • Broadband Solutions
        • Securing Site-to-Site Connectivity
        • Monitoring the Network
        • Troubleshooting the Network

        

      Certification from TSSC (Telecom Sector Skill Council), a non-profit industry driven body set up under the aegis of the NSDC (National Skill Development Council), at the completion of the program.

      Objectives

      After successfully completing this course, you should be able to:

      • Understand the need for SDN and which problems it can help solve.
      • Discuss SDN and its different implementations.
      • Identify the different areas where SDN can be utilized.
      • Describe the OpenFlow protocol.
      • Discuss different real-world implementations of SDN.
      • Describe a Contrail use case.

      Intended Audience

      The primary audiences for this course are the following:

      • Network engineers with the desire to learn about software defined networking; and
      • Engineers who are interested in hands-on experience with Juniper Contrail.

      Course Level

      JSDNF is an introductory-level course.

      Prerequisites

      The following are the prerequisites for this course:

      • Fundamental understanding of network design; and
      • An understanding of network protocols such as BGP, OSPF, and STP.

      Course Contents

      Chapter 1: Introduction

      Chapter 2: Introduction to Software Defined Networking

      • Current Infrastructure Limitations
      • Advantages of Implementing SDN
      • SDN Camps and Definitions
      • SDN Versus NFV
      • Relevant Entities and Standards Bodies

      Chapter 3: SDN Applications

      • SDN in the Data Center
      • SDN in the Enterprise
      • SDN in the WAN
      • SDN in Transport Networks

      Chapter 4: OpenFlow

      • Northbound, Southbound and East/West Interfaces
      • OpenFlow Overview
      • OpenFlow Message Exchange

      Chapter 5: Juniper Contrail

      • Juniper Contrail
      • OpenStack
      • Lab: Introduction to Contrail

      Chapter 6: SDN Use Cases

      • Lab: Contrail Features

       

        



      Demo Course_Handset Repair Engineer

      This program provides participants with insights into business applications and use cases in different functional areas for government and various industries such as telecom, retail, ecommerce, automobile, hospitality, airline, healthcare, and banking & finance.

      Use cases of Big Data and Data Science

      Spark, an alternative for fast data analytics
      Although Hadoop captures the most attention for distributed data analytics, there are alternatives that provide some interesting advantages over the typical Hadoop platform. Spark is a scalable data analytics platform that incorporates primitives for in-memory computing and therefore offers performance advantages over Hadoop's cluster storage approach. Spark is implemented in and exploits the Scala language, which provides a unique environment for data processing. Get to know the Spark approach for cluster computing and its differences from Hadoop.

      Spark is an open source cluster computing environment similar to Hadoop, but it has some useful differences that make it superior in certain workloads—namely, Spark enables in-memory distributed datasets that optimize iterative workloads in addition to interactive queries.

      Spark is implemented in the Scala language and uses Scala as its application framework. Unlike Hadoop, Spark and Scala create a tight integration, where Scala can easily manipulate distributed datasets as locally collective objects.

      Although Spark was created to support iterative jobs on distributed datasets, it's actually complementary to Hadoop and can run side by side over the Hadoop file system. This behavior is supported through a third-party clustering framework called Mesos. Spark was developed at the Algorithms, Machines, and People Lab at the University of California, Berkeley, to build large-scale and low-latency data analytics applications.


      Spark cluster computing architecture
      Although Spark has similarities to Hadoop, it represents a new cluster computing framework with useful differences. First, Spark was designed for a specific type of workload in cluster computing—namely, those that reuse a working set of data across parallel operations (such as machine learning algorithms). To optimize for these types of workloads, Spark introduces the concept of in-memory cluster computing, where datasets can be cached in memory to reduce their latency of access.

      Spark also introduces an abstraction called resilient distributed datasets (RDDs). An RDD is a read-only collection of objects distributed across a set of nodes. These collections are resilient, because they can be rebuilt if a portion of the dataset is lost. The process of rebuilding a portion of the dataset relies on a fault-tolerance mechanism that maintains lineage (or information that allows the portion of the dataset to be re-created based on the process from which the data was derived). An RDD is represented as a Scala object and can be created from a file; as a parallelized slice (spread across nodes); as a transformation of another RDD; and finally through changing the persistence of an existing RDD, such as requesting that it be cached in memory.

      Applications in Spark are called drivers, and these drivers implement the operations performed either on a single node or in parallel across a set of nodes. Like Hadoop, Spark supports a single-node cluster or a multi-node cluster. For multi-node operation, Spark relies on the Mesos cluster manager. Mesos provides an efficient platform for resource sharing and isolation for distributed applications (see Figure 1). This setup allows Spark to coexist with Hadoop in a single shared pool of nodes.
      Figure 1. Spark relies on the Mesos cluster manager for resource sharing and isolation.

      Spark programming model
      A driver can perform two types of operations on a dataset: an action and a transformation. An action performs a computation on a dataset and returns a value to the driver; a transformation creates a new dataset from an existing dataset. Examples of actions include performing a Reduce operation (using a function) and iterating a dataset (running a function on each element, similar to the Map operation). Examples of transformations include the Map operation and the Cache operation (which requests that the new dataset be stored in memory).
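
      To make this distinction concrete, here is a minimal PySpark sketch of the model described above; it assumes a local pyspark installation, and the numbers are made up. The map and cache calls are transformations (lazy), while count and reduce are actions that trigger computation and return values to the driver.

        from pyspark import SparkContext

        sc = SparkContext("local", "rdd-demo")

        # Create an RDD by parallelizing a local collection (a "parallelized slice").
        numbers = sc.parallelize(range(1, 101))

        # Transformations build new RDDs lazily; nothing runs yet.
        squares = numbers.map(lambda x: x * x)
        squares.cache()  # request that this dataset be kept in memory

        # Actions trigger computation and return values to the driver.
        print(squares.count())                     # 100
        print(squares.reduce(lambda a, b: a + b))  # 338350

        sc.stop()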

      Course Content:

      • Introduction to Data Analysis with Spark
      • L1. Installing and running Spark.
        1. What is Apache Spark?
        2. Downloading and installing Spark.
        3. The first step to a Spark program in Python, Scala and Java.
      • L2. RDD operations.
        1. Initializing a SparkContext.
        2. Loading and saving data into an RDD.
        3. RDD Basics.
        4. Passing Functions to Spark.
        5. Common Transformations and Actions.
        6. Working with Key-Value Pairs.
      • Laboratory exercise 1.
      • Basic operations with RDD.
      • Spark SQL and Spark Streaming (a short Spark SQL sketch follows this list)
      • L3. Programmatically Specifying the Schema.
        1. Initializing Spark SQL.
        2. Basic Query Example.
        3. SchemaRDDs.
      • L4. Basic Concepts of Spark Streaming.
        1. Initializing StreamingContext.
        2. Stateless and stateful transformations.
        3. Core Sources.
        4. Additional Sources.
      • Laboratory exercise 2.
      • SQL operations and log files analysis.
      • Machine Learning with MLlib
      • L5. Obtaining, Processing, and Preparing Data with Spark.
        1. Data Types.
        2. Exploring and visualizing data.
        3. Processing and transforming data.
        4. Extracting useful features from data.
        5. Basic statistics.
      • L6. Classification and regression with Spark MLlib.
        1. Regression models and Classifiers.
        2. Naive Bayes.
        3. Advanced Text Processing.
      • Laboratory exercise 3.
      • Texts classification.
      • Spark Graph Processing
      • L7. The Property Graph.
        1. Getting Started.
        2. Example Property Graph.
        3. Graph Operators.
      • L8. Graph Algorithms.
        1. PageRank.
        2. Connected Components.
        3. Triangle Counting.
      • Laboratory exercise 4.
      • Social Network Graph building.
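
      One note on the Spark SQL lessons above: SchemaRDDs were the original Spark SQL abstraction and have since been superseded by DataFrames in current Spark releases. The minimal sketch below uses the modern SparkSession API; it assumes a local pyspark installation, and the rows are made up.

        from pyspark.sql import SparkSession, Row

        spark = SparkSession.builder.master("local").appName("sql-demo").getOrCreate()

        # Build a DataFrame (the successor of the SchemaRDD) and register
        # it as a temporary view so it can be queried with SQL.
        rows = [Row(name="a", clicks=10), Row(name="b", clicks=25), Row(name="c", clicks=40)]
        df = spark.createDataFrame(rows)
        df.createOrReplaceTempView("events")

        spark.sql("SELECT name, clicks FROM events WHERE clicks > 15").show()
        spark.stop()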

      Statistics and Probability for Data Analysis

      Decision making under uncertainty is largely based on applying statistical data analysis for probabilistic risk assessment of your decisions. Managers need to understand variation for two key reasons: first, so that they can lead others to apply statistical thinking in day-to-day activities; and second, so that they can apply the concept for continuous improvement. This course will provide you with hands-on experience to promote the use of statistical thinking and techniques, and to apply them to make educated decisions whenever there is variation in business data.

      This course exposes you to elementary statistics and probability concepts that provide the necessary foundation for application in data analysis.

      Statistics

      • Dataset
      • Descriptive statistics
      • Percentiles
      • Inferential statistics
      • Scaling
      • Normalization
      • Transformation
      • Outlier detection/fixing
      • Sparseness elimination
      • Sampling
      • Exploratory data analysis
      • ANOVA
      • Correlation analysis
      • Feature selection
      • Uncorrelated feature elimination

      Probability

      • Probability theory
      • Distributions
      • Hypothesis testing
      • Significance (p-) values (see the sketch after this list)
      • Maximum likelihood
      • Monte Carlo methods
      • Regression analysis
      • Time series
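
      As a minimal illustration of hypothesis testing and p-values from the topics above, the following sketch runs a two-sample t-test with SciPy on simulated data; the samples, means and significance threshold are made up for illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(42)

        # Two made-up samples, e.g. process measurements before and after a change.
        before = rng.normal(loc=100, scale=10, size=50)
        after = rng.normal(loc=105, scale=10, size=50)

        # Two-sample t-test: the p-value measures the evidence against the null
        # hypothesis that both samples come from populations with the same mean.
        t_stat, p_value = stats.ttest_ind(before, after)
        print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
        if p_value < 0.05:
            print("Reject the null hypothesis at the 5% significance level.")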

      R for Business Analytics

      During the last decade, the momentum coming from both academia and industry has lifted the R programming language to become the single most important tool for computational statistics, visualization and data science. Worldwide, millions of statisticians and data scientists use R to solve their most challenging problems in fields ranging from computational biology to quantitative marketing. R has become the most popular language for data science and an essential tool for Finance and analytics-driven companies such as Google, Facebook, and LinkedIn.

      Every data analysis technique at your fingertips

      R includes virtually every data manipulation, statistical model, and chart that the modern data scientist could ever need. You can easily find, download and use cutting-edge community-reviewed methods in statistics and predictive modeling from leading researchers in data science, free of charge.

      Create beautiful and unique data visualizations

      Representing complex data with charts and graphs is an essential part of the data analysis process, and R goes far beyond the traditional bar chart and line plot. Heavily influenced by thought leaders in data visualization like Bill Cleveland and Edward Tufte, R makes it easy to draw meaning from multidimensional data with multi-panel charts, 3-D surfaces and more. The custom charting capabilities of R are featured in many of the stunning infographics seen in the New York Times, The Economist, and the FlowingData blog.

      Get better results faster

      Instead of using point-and-click menus or inflexible "black-box" procedures, R is a programming language designed expressly for data analysis. Intermediate level R programmers create data analyses faster than users of legacy statistical software, with the flexibility to mix-and-match models for the best results. And R scripts are easily automated, promoting both reproducible research and production deployments.

      Draw on the talents of data scientists worldwide

      As a thriving open-source project, R is supported by a community of more than 2 million users and thousands of developers worldwide. Whether you're using R to optimize portfolios, analyze genomic sequences, or to predict component failure times, experts in every domain have made resources, applications and code available for free, online.

      Course

      This course introduces you to the R statistical processing language, including how to install R on your computer, read data from SPSS and spreadsheets, and use packages for advanced R functions.

      The course continues with examples on how to create charts and plots, check statistical assumptions and the reliability of your data, look for data outliers, and use other data analysis tools. Finally, learn how to get charts and tables out of R and share your results with presentations and web pages.

      Course Content

      • What is R?
      • Installing R
      • Data Types
      • Import/Export Data
      • Operators
      • Built-in Functions
      • Control Structures
      • User-defined Functions
      • Data Management
      • Debugging

      Machine Learning and Data Mining

      This course offers a thorough grounding in machine learning concepts as well as practical advice on applying machine learning tools and techniques in real-world data mining situations. It will teach you everything you need to know about preparing inputs, interpreting outputs, evaluating results, and the algorithmic methods at the heart of successful data mining. A short worked example follows the syllabus below.

      Course Content/Syllabus of Machine Learning and Data Mining

      • Introduction
        • Machine Learning Basics
        • Data Mining Basics
        • Supervised and Unsupervised Learning
      • Regression Analysis
        • Linear Regression
        • Logistic Regression
      • Regularization
        • Bias
        • Variance
        • Over-fitting
      • Classifiers
        • Decision Trees & Random Forest
        • k-Nearest Neighbor
        • Support Vector Machines
      • Clustering
        • K-Means Clustering
      • Applying Machine Learning
        • Model Selection
        • Train/Test/Validation sets
        • Learning curves
      • Machine Learning Model Design
        • Error Analysis
        • Error Metric
        • Precision and Recall
      • Dimensionality Reduction
        • Principal Component Analysis
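
      As the worked example mentioned above, here is a minimal scikit-learn sketch covering several syllabus items at once: a train/test split, a random forest classifier, and precision/recall as error metrics. The dataset is a sample bundled with scikit-learn, and the parameter values are illustrative choices, not recommendations.

        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import train_test_split
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.metrics import precision_score, recall_score

        # Load a bundled dataset and hold out a test set for evaluation.
        X, y = load_breast_cancer(return_X_y=True)
        X_train, X_test, y_train, y_test = train_test_split(
            X, y, test_size=0.3, random_state=0)

        # Fit a random forest (one of the classifiers listed above).
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_train, y_train)

        # Evaluate with the error metrics from the syllabus.
        y_pred = clf.predict(X_test)
        print("precision:", precision_score(y_test, y_pred))
        print("recall:   ", recall_score(y_test, y_pred))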

      Online Weekend Executive Post Graduate Program (EPGP) in Data Science, Business Analytics & Big Data in association with IBM
      A True Holistic Data Science Program


      "There were 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created in every 2 days" 
      - Eric Schmidt, Executive Chairman, Google

      Every day, we create 2.5 quintillion bytes of data; so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few. This is big data, and it mainly consists of unstructured data. Big data poses challenges in terms of storage and of analysis, both at rest and in real time. There is a gold mine to dig if we can make sense of big data.

      In the global and highly interconnected world we live in, the greatest power is the right information. Data Science, Big Data and Business Analytics are thus the primary tools for any organisation, society or government seeking competitive advantage and optimisation of the existing ecosystem. A career in Business Analytics, Big Data and Data Science promises a lot of challenge, opportunity, and reward.

      Modern Data Scientists are Supermen with a Crystal Ball

      Data Scientists are supermen who tell stories out of data, irrespective of its size: big or small. They answer questions like: What is happening? What will happen? What should we do now? These are commonly termed BI, Predictive Analytics and Prescriptive Analytics. They use statistics, machine learning, NLP, R, Python, Hadoop and Spark to create a crystal ball to predict the future, solve business problems or find missed opportunities. Snapdeal, which has over 20 million subscribers and generates terabytes of data through its interactions with customers, in addition to a catalogue of over 5 million, churns 15 million data points (related data sets, like a consumer shopping on specific days for a particular thing) within two hours using Hadoop. About 35% of its orders come from recommendation and personalization systems, and the conversion rate of such orders is 20-30% higher than normal orders, the company claims.

      Post Graduate Program in Data Science, Business Analytics and Big Data  (PGP-DS-BA-BigData)

      Post Graduate Program in Data Science, Business Analytics and Big Data (PGP-DS-BA-BigData) is India's first high-end data science program, designed and delivered by Aegis School of Business, Data Science & Telecommunication in association with IBM to train the new generation of data-savvy professionals. This 11-month program provides intensive hands-on training to develop the necessary and unique set of skills required for a successful career in the fastest growing and intellectually stimulating fields of Data Science, Big Data, Business Analytics, Predictive Analytics, NLP, ML and Cognitive Computing. Students who earn a PGP develop deep quantitative capabilities and technical expertise and are equipped to create business and social value by extracting useful insights and applying them in various industries in the role of Data Scientist, Big Data Adviser or modern Business Analyst.

      This program is delivered at the Centre for Excellence in Telecom Technology and Management (CETTM), MTNL's world-class campus, with the state-of-the-art "IBM Business Analytics Lab" and "IBM Cloud Computing Lab", at Powai in Mumbai.

       

      Sheetal Soni, Country Channel Manager, IBM India, speaking on the PGP in Business Analytics and Big Data in association with Aegis

      Aegis School of Business, Data Science & Telecommunication 
      Aegis is a leading school in the field of Telecom Management/Technology Management, started in 2002 with the support of Bharti Airtel to develop future telecom leaders. Aegis offers various programs in 25 countries to top executives from top IT/Telecom firms. In 2015, Aegis joined hands with IBM to offer high-end courses in the fields of Business Analytics, Big Data, Cloud Computing and Mobility. Read about Aegis at www.aegis.edu.in and www.bellaward.com

      IBM

      To meet growing client demands in today's data-intensive industries, IBM has established the world's deepest and broadest portfolio of Big Data and Analytics technologies and solutions, spanning services, software, research and hardware. Fueling this is the fact that IBM has made a multi-billion dollar investment in big data and analytics ($24 billion since 2005, through both acquisitions and R&D) to extend its leadership position in this strategic market. IBM has been named a Leader in the IDC MarketScape for its Business Analytics practice.

      Why a PGP/MS in Data Science, Business Analytics and Big Data?

      Aegis is a leader in Data Science, Business Analytics and Big Data. Read what makes the Aegis PGP in Business Analytics & Big Data, the first data science program of India in association with IBM, the best program in India.

      Highlights of the Program

      PGP program Certification from IBM

      Certification from IBM, a world leader in Big Data Analytics, at the completion of the program.

      Best Data Science curriculum

      The program curriculum is designed as per the Data Scientist's skills and competencies framework, with the help of leading Data Scientists. This program has a wide range of core and elective courses which no other institute around the world offers. Check the list of courses at the bottom.

      Ranked Among the Top Ten Business Analytics Courses in India

      Ranked among the top ten Business Analytics programs in India by Analytics Magazine. Check the ranking.

      A True Data Science Program and NOT an MBA/PGDBM in Business Analytics

      This program is a core-technology MS-level Data Science program and NOT an inferior cousin of MBA/PGDBM programs in Business Analytics. It lays the foundation for a serious Developer or Data Scientist career, or for Big Data advisory & consulting roles.

      IBM Business Analytics Lab

      IBM has set up an IBM Business Analytics Lab and an IBM Cloud Computing Lab on the campus. The program is delivered by IBM subject matter experts and the best Data Scientists from around the world.

      Big Data Product Factory

      Live projects: work on live projects such as churn prediction and call-drop analysis for leading telecos, NLP projects, consumer analytics, sentiment analysis, etc. at the Aegis Big Data Product Factory as part of the program.

      Exposure to wide range of Tech, Tools and S/W

      Hands-on exposure to IBM DB2, IBM Cognos TM1, IBM Cognos Insight, IBM InfoSphere BigInsights, IBM Worklight, IBM BlueMix, R, Python, SAS, Hadoop, MapReduce, Spark, Elasticsearch, Tableau, EMR, AWS, etc.

      World class infrastructure

      On-campus classes are held at the world-class infrastructure of Aegis at CETTM, MTNL, Powai, Mumbai.

      CMC, Internship and Placement

      The Career Management Center (CMC) at Aegis facilitates excellent paid internships of 2 to 3 months for all students with various companies, which generally lead to final placement in roles such as Data Scientist, Manager - Data Science, Business Analyst and Risk Analyst with companies like IBM, Persistent, HDFC, Loginext, Angelbroking, Creditvidya, L&T Infotech, Virtusa, Ixsight, Pentation, Clover Infotech, L&T Finance, Guppy Media Incorporation, etc. This year's placement record for the PGP in Data Science, Business Analytics and Big Data:

      Fresher lowest package: 8.5 lacs
      Fresher highest package: 11 lacs
      Experienced candidates got over a 100% hike on their last package. *For experience up to 7 to 8 years.
      90% got a Data Scientist role
      Paid internship for everyone for 2 to 3 months
      *Aegis does not provide any kind of job guarantee

      Course Modules taught by IBM

      IBM faculty will be teaching three courses: (i) Business Intelligence using Cognos BI (ii) Big Data Analytics using IBM InfoSphere Big Insight and (iii) Enterprise Performance Management using IBM Cognos TM1

      Scholarships

      Dean's Scholarship for Women in Technology Leadership; Aegis Graham Bell Award Scholarship for Big Data Products; Data Science Scholar. Study loan by HDFC Credila.

      Globally acceptable Credit Structure

      This program follows a globally acceptable 45-credit-unit structure for a master's degree. In North America, a master's degree is a minimum of 36 credit units. The program is spread over 9 months of coursework plus a 2-3 month internship or consulting assignment.

      Skill for various industries & Functional areas

      The curriculum caters to the skill requirements in various industries like Ecommerce, Telecom, Banks, Computer Services, Education, Healthcare, Insurance, Manufacturing, Retail and other industries. 

      Diverse student profile

      Aegis participants are from around the world, with experience ranging from 2 to 25 years and from diverse industries and backgrounds.

      Center of Innovation

      Through Aegis's initiative, the Aegis Graham Bell Awards for innovation in TIME (Telecom, Internet, Media, Edutainment) and SMAC (Social, Mobile, Analytics and Cloud), Aegis students get to know the best innovations and their innovators. Hon'ble Dr. Jitendra Singh, Minister of State in the Prime Minister's Office, Department of Atomic Energy and Department of Space, honored the winners of the Aegis Graham Bell Awards 2015 for their breakthrough innovations. Check the winners under the Analytics and Big Data categories.

      Delivered by Best Data Scientist

      This program is delivered by the best Data Scientists from around the world, engaged in real-life Data Science and Big Data Analytics projects. Check profiles.

      World Class LMS on Cloud

      Aegis uses a world-class Learning Management System (LMS) on the cloud called mUniversity (mUni). All lectures, study material, ebooks, business cases and video lectures are available on the LMS. Check the sample session of Mr. Srikanth Velamakanni, CEO, Fractal Analytics at mUni.

      Big Data Day leadership Series

      Aegis Big Data Day Meetup provides an opportunity to meet Data Science, Big Data, Analytics leaders. Register for latest Meetup

            

       Download Aegis Business Analytics and Big Data Brochure

      Is Business Analytics, Big Data and Data Science for me?

      If you enjoy working with data in different ways, exploring the stories that data may reveal, and even experimenting with visualization techniques, you may enjoy a role in data science. Whether you have an academic background in business, economics, technology, statistics, mathematics, or engineering, you may be looking for a career change that will prepare you to generate high impact, or searching for a challenging role that allows you to integrate the value of data into all business functions. Data analysis is necessary in almost every field and industry, and Business Analysts, Big Data professionals and Data Scientists are expected to play a role in guiding strategic decisions. They are in a unique position to assist leaders in determining the right questions to ask, as well as in interpreting the answers that data provides, to help determine the organization's most effective courses of action.

      Job Opportunities 

      While there is a large and growing demand for professionals trained to collect data and use analytics to drive performance, there is currently a deficit in the supply of these individuals in the global job market. The deficit in the market for data/Big Data analytics professionals is growing exponentially and poses a serious challenge for recruiters, but it is an enviable opportunity for professionals who have the right skills and training to fill these positions.

      Program Models:

      Full Time Post Graduation Program (OnCampus)

    • Total Program: 36 Credit Units

    • Program Duration: 11 months

    • 9 months of classroom training + Live projects + One Capstone project or Data Product Development + 2 to 3 months of Internships

    • 3 terms each of 3 months

      View details and Apply

      Part Time Executive Program in Weekend model (OnCampus)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • Sat and Sun on-campus classes in Powai spread over 9 months

    • 9 months of classroom training + Live projects + One Capstone project or Data Product Development + 2-3 months industry internship/project/consulting assignment

      View details and Apply

      Part Time Executive Program in Hybrid model (OnCampus + Online)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • One Block of 5 days in every 3 months plus Sat and Sun online classes spread over 11 months 

    • 9 months of classroom training + Live projects + One Capstone project or Data Product Development + 2-3 months industry internship/project/consulting assignment

      View details and Apply 

      Part Time Executive Program in Online model (Online Live Interactive)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • Live interactive Classes on Saturday and Sunday

    • 9 months of classroom training + Live projects + One Capstone project or Data Product Development + 2-3 months industry internship/project/consulting assignment

      View details and Apply

       Download Aegis Business Analytics and Big Data Brochure

      Eligibility Criteria for Admissions: 

      - Working executives with a bachelor's degree.

      Admission process:

      • Online application (click on the Apply button at the top of this page to register and apply)
      • Personal interview
      • Admission offer (if selected)

      Get in Touch with Aegis Big Data Career Adviser Team at  +91 704 531 4371/ +91 882 808 4908/ +91 902 213 7010
      You can also email your Resume to admission@aegis.edu.in and request a call back for consultation 

      Scholarships: 

      Dean’s Scholarship for Women in Technology Leadership

      Saluting the women leaders in Technology, Aegis has announced the “Dean’s Scholarship for Women in Technology Leadership”  

      Who should apply: 

      Aspiring women leaders. Please check the eligibility criteria above.

      What you get:

      • 100% scholarship to study Full Time PGP in Business Analytics and Big Data Program 
      • Learn and work on Big Data Product Factory
      • Come out as Data Scientist or Entrepreneur

      Data Science Scholar

      Who should apply:

      • PhD, MTech, MSc or MS in Computer Science, Physics, Maths, Statistics or Economics
      • Or if you have developed a data product

      What you get:

      • 100% scholarship to study Full Time PGP in Business Analytics and Big Data Program 
      • Learn and work on Big Data Product Factory
      • Job offer to work as a Data Scientist at the Big Data Product Factory and Aegis (this is optional)

      Aegis Graham Bell Award Scholarship for Big Data Products for startups

      What you get: 

      Scholarship 
      For Executive PGP in
      Business Analytics & Big Data in association with IBM

      Workspace
      At Aegis Big Data Product Factory
      in Mumbai.

      Manpower
      Access to Highly Skilled Big Data Manpower to work on your Product/Solution.

      Mentorship
      From Business Leaders, Data Scientists & Functional Heads

      Recognition
      Get Recognized at The Largest Innovation Award in TIME & SMAC
      'Aegis Graham Bell Awards'.

      Networking
      Opportunity to network with business leaders, functional heads, peers & VCs.

      POC
      Help with proof of concept (POC)

      Funding
      Help with fundraising and introductions to VCs

      Who should apply:

      • Start-up firms
      • Individuals who have developed a data product
      • Those with a breakthrough idea for a Big Data product/solution/platform who want to develop it
      • Check also www.bellaward.com

      How to apply for the scholarship: click on the Apply button at the top and submit the application form. The admissions and scholarship application is the same.

      Aegis School Of Business, Data Science & Telecommunication
      CETTM, MTNL, Technology Street, | Hiranandani Gardens, Powai, Mumbai – 400076, India
      Mahesh Block –B, Sector 15, Central Business District (CBD), Belapur, New Mumbai 400614, India
      www.aegis.edu.in 

      Python is a powerful, flexible, open-source language that is easy to learn, easy to use, and has powerful libraries for data manipulation and analysis. It is among the top three tools for analytics applications, along with R and SAS. Having originated as an open-source scripting language, Python's usage has grown over time. Today, it sports libraries (numpy, scipy and matplotlib) and functions for almost any statistical operation or model building you may want to do. Since the introduction of pandas, it has become very strong in operations on structured data. Its simple syntax is very accessible to programming novices, and will look familiar to anyone with experience in Matlab, C/C++, Java, or Visual Basic. Python has a unique combination of being both a capable general-purpose programming language and easy to use for analytical and quantitative computing.
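
      As a minimal illustration of the structured-data strength mentioned above, here is a short pandas sketch; the dataset is made up.

        import pandas as pd

        # A small, made-up structured dataset.
        df = pd.DataFrame({
            "region": ["north", "south", "north", "south"],
            "sales":  [120, 90, 150, 110],
        })

        # Aggregate: average sales per region.
        print(df.groupby("region")["sales"].mean())

        # Filter: rows with above-threshold sales.
        print(df[df["sales"] > 100])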

      For over a decade, Python has been used in scientific computing and highly quantitative domains such as finance, oil and gas, physics, and signal processing. It has been used to improve Space Shuttle mission design, process images from the Hubble Space Telescope, and was instrumental in orchestrating the physics experiments that led to the discovery of the Higgs boson (the so-called "God particle"). At the same time, Python has been used to build massively scalable web applications like YouTube, and has powered much of Google's internal infrastructure. Python is easy for analysts to learn and use, but powerful enough to tackle even the most difficult problems in virtually any domain. It integrates well with existing IT infrastructure, and is very platform-independent. Among modern languages, the agility and productivity of Python-based solutions are legendary. Companies of all sizes and in all areas, from the biggest investment banks to the smallest social/mobile web app startups, are using Python to run their business and manage their data.

      This course covers the Python programming language and standard library, with a focus on applying Python to problems in scripting, data analysis, and systems programming. The course assumes no prior experience with Python, but does assume participants know how to program in another programming language. It is especially well-suited for programmers who want to know what Python is all about without extra fluff.

      Course Content:

      • Strings and Lists
      • Loops
      • Functions
      • Tuples, Dictionaries and Files
      • Regular Expressions (see the sketch after this list)
      • NLTK
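
      A minimal sketch of the regular-expressions topic above, using only the standard library re module; the text and patterns are made-up examples.

        import re

        text = "Write to info@example.edu or support@example.org; call 022-1234567."

        # findall: extract all email-like tokens (a simplified pattern).
        print(re.findall(r"[\w.+-]+@[\w-]+\.[\w.]+", text))

        # sub: redact digits, e.g. to mask phone numbers.
        print(re.sub(r"\d", "#", text))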

      Information retrieval

      Information retrieval (IR) is the activity of obtaining information resources relevant to an information need from a collection of information resources. Searches can be based on metadata or on full-text (or other content-based) indexing.

      Automated information retrieval systems are used to reduce what has been called "information overload". Many universities and public libraries use IR systems to provide access to books, journals and other documents. Web search engines are the most visible IR applications.

      Overview:

      An information retrieval process begins when a user enters a query into the system. Queries are formal statements of information needs, for example search strings in web search engines. In information retrieval a query does not uniquely identify a single object in the collection. Instead, several objects may match the query, perhaps with different degrees of relevancy. An object is an entity that is represented by information in a database. User queries are matched against the database information. Depending on the application the data objects may be, for example, text documents, images, audio, mind maps or videos. Often the documents themselves are not kept or stored directly in the IR system, but are instead represented in the system by document surrogates or metadata. Most IR systems compute a numeric score on how well each object in the database matches the query, and rank the objects according to this value. The top ranking objects are then shown to the user. The process may then be iterated if the user wishes to refine the query.
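
      To make the score-and-rank idea concrete, here is a minimal sketch using scikit-learn's TF-IDF vectorizer and cosine similarity, which corresponds to the vector space model covered in Week 2 below; the documents and query are made up.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = [
            "spark is a cluster computing framework",
            "elasticsearch is a distributed search engine",
            "information retrieval ranks documents against a query",
        ]

        # Index: represent every document as a TF-IDF vector.
        vectorizer = TfidfVectorizer()
        doc_vectors = vectorizer.fit_transform(docs)

        # Score: cosine similarity between the query vector and each document.
        query = ["distributed search"]
        scores = cosine_similarity(vectorizer.transform(query), doc_vectors).ravel()

        # Rank: show documents from best to worst match.
        for i in scores.argsort()[::-1]:
            print(f"{scores[i]:.3f}  {docs[i]}")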

      Course Curriculum
      The course is organized into 24 lessons (4 hours per week) in batches of two. It contains an overview of information retrieval theory and a practical part using the Elasticsearch platform.
      Week 1 (Introduction):

      • Introduction to information retrieval
      • Example use cases
      • Introduction to large-scale web search and its problems
      • Unstructured data
      • Boolean retrieval example

      Week 2 (models I):

      • Term weighting
      • Vector space modeling (cosine distance)
      • Okapi BM25
      • Probabilistic models (KL divergence)

      Week 3 (models II+data preprocessing):

      • Topic modeling
      • Text classification
      • Linguistic processing (tokenization, NLP theory and resources: stemmers, WordNet, NER resolvers)

      Week 4 (Elasticsearch + project assignment):

      • Elasticsearch installation
      • Hands-on tutorial
      • Project assignment

      Week 5 (web IR):

      • HITS, PageRank (a PageRank sketch follows Week 6 below)
      • Web search engine dissection (crawlers, scrapers)

      Week 6 (IR evaluation + advanced topics/project discussions):

      • Precision/recall
      • DCG
      • Duplicate detection
      • Text summarization
      • Student discussions
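
      As a minimal illustration of the PageRank topic from Week 5 above, the following sketch runs the standard power iteration on a tiny made-up link graph; the damping factor 0.85 is the conventional choice.

        import numpy as np

        # A tiny made-up web: links[i] lists the pages that page i links to.
        links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
        n = len(links)

        # Column-stochastic transition matrix: column j spreads page j's
        # rank evenly across the pages it links to.
        M = np.zeros((n, n))
        for src, outs in links.items():
            for dst in outs:
                M[dst, src] = 1.0 / len(outs)

        # Power iteration with damping factor d = 0.85.
        d = 0.85
        rank = np.full(n, 1.0 / n)
        for _ in range(100):
            rank = (1 - d) / n + d * M @ rank

        print(rank.round(3))  # page 3 has no in-links, so it keeps the lowest rank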

      Post Graduate Program (PGP) in Business Analytics & Big Data in association with IBM

      A True Holistic Data Science Program 


      "There were 5 exabytes of information created between the dawn of civilization through 2003, but that much information is now created in every 2 days"
      - Eric Schmidt, Executive Chairman, Google

      Every day, we create 2.5 quintillion bytes of data; so much that 90% of the data in the world today has been created in the last two years alone. This data comes from everywhere: sensors used to gather climate information, posts to social media sites, digital pictures and videos, purchase transaction records, and cell phone GPS signals, to name a few. This is big data, and it mainly consists of unstructured data. Big data poses challenges in terms of storage and of analysis, both at rest and in real time. There is a gold mine to dig if we can make sense of big data.

      In the global and highly interconnected world we live in, the greatest power is the right information. Data Science, Big Data and Business Analytics are thus the primary tools for any organisation, society or government seeking competitive advantage and optimisation of the existing ecosystem. A career in Business Analytics, Big Data and Data Science promises a lot of challenge, opportunity, and reward.

      Modern Data Scientists are Supermen with a Crystal Ball

      Data Scientists are supermen who tell stories out of data, irrespective of its size: big or small. They answer questions like: What is happening? What will happen? What should we do now? These are commonly termed BI, Predictive Analytics and Prescriptive Analytics. They use statistics, machine learning, NLP, R, Python, Hadoop and Spark to create a crystal ball to predict the future, solve business problems or find missed opportunities. Snapdeal, which has over 20 million subscribers and generates terabytes of data through its interactions with customers, in addition to a catalogue of over 5 million, churns 15 million data points (related data sets, like a consumer shopping on specific days for a particular thing) within two hours using Hadoop. About 35% of its orders come from recommendation and personalization systems, and the conversion rate of such orders is 20-30% higher than normal orders, the company claims.

      Post Graduate Program in Business Analytics and Big Data  (PGP-BA-BigData)

      Post Graduate Program in Business Analytics and Big Data (PGP-BA-BigData) is India's first high-end data science program, designed and delivered by Aegis School of Business & Telecommunication in association with IBM to train the new generation of data-savvy professionals. This 11-month program provides intensive training to develop the necessary and unique set of skills required for a successful career in the world of Big Data and Business Analytics. Students who earn a PGP in Business Analytics and Big Data develop deep quantitative capabilities and technical expertise and are equipped to create business and social value by extracting useful insights and applying them in various industries.

      This program is delivered at the Centre for Excellence in Telecom Technology and Management (CETTM), MTNL's world-class campus, with the state-of-the-art "IBM Business Analytics Lab" and "IBM Cloud Computing Lab", at Powai in Mumbai.

      Aegis School of Business, Data Science & Telecommunication
      Aegis is a leading school in the field of Telecom Management/Technology Management, started in 2002 with the support of Bharti Airtel to develop future telecom leaders. Aegis offers various programs in 25 countries to top executives from top IT/Telecom firms. In 2015, Aegis joined hands with IBM to offer high-end courses in the fields of Business Analytics, Big Data, Cloud Computing and Mobility. MTNL, a leading Government of India telecom service provider, is Aegis's infrastructure partner in Mumbai, and IBM has set up the IBM Business Analytics Lab in MTNL's world-class infrastructure in Powai, Mumbai, India.

      Read about Aegis at www.aegis.edu.in and www.bellaward.com

      IBM

      To meet growing client demands in today's data-intensive industries, IBM has established the world's deepest and broadest portfolio of Big Data and Analytics technologies and solutions, spanning services, software, research and hardware. Fueling this is the fact that IBM has made a multi-billion dollar investment in big data and analytics ($24 billion since 2005, through both acquisitions and R&D) to extend its leadership position in this strategic market. IBM has been named a Leader in the IDC MarketScape for its Business Analytics practice.

      Why a PGP/MS in Data Science/Business Analytics and Big Data?

      Aegis is a leader in Data Science, Business Analytics and Big Data. Read what makes the Aegis PGP in Business Analytics & Big Data, the first data science program of India in association with IBM, the best program in India.

      Highlights of the Program:

      PGP program Certification from IBM

      Certification from IBM, a world leader in Big Data Analytics, at the completion of the program.

      Best Data Science curriculum

      The program curriculum is designed as per the Data Scientist's skills and competencies framework, with the help of leading Data Scientists. This program has a wide range of core and elective courses which no other institute around the world offers. Check the list of courses at the bottom.

      Ranked Among the Top Ten Business Analytics Courses in India

      Ranked among the top ten Business Analytics programs in India by Analytics Magazine. Check the ranking.

      A True Data Science Program and NOT an MBA/PGDBM in Business Analytics

      This program is a core-technology MS-level Data Science program and NOT an inferior cousin of MBA/PGDBM programs in Business Analytics. It lays the foundation for a serious Developer or Data Scientist career, or for Big Data advisory & consulting roles.

      IBM Business Analytics Lab

      IBM has set up an IBM Business Analytics Lab and an IBM Cloud Computing Lab on the campus. The program is delivered by IBM subject matter experts and the best Data Scientists from around the world.

      Big Data Product Factory

      Live projects: work on live projects such as churn prediction and call-drop analysis for leading telecos, NLP projects, consumer analytics, sentiment analysis, etc. at the Aegis Big Data Product Factory as part of the program.

      Exposure to wide range of Tech, Tools and S/W

      Hands-on exposure to IBM DB2, IBM Cognos TM1, IBM Cognos Insight, IBM InfoSphere BigInsights, IBM Worklight, IBM Bluemix, R, Python, SAS, Hadoop, MapReduce, Spark, Elasticsearch, Tableau, EMR, AWS, etc.

      World class infrastructure

      On-campus classes are held in world-class infrastructure at Aegis, CETTM, MTNL, Powai, Mumbai.

      Career Management Center (CMC)

      The CMC at Aegis provides graduate students, alumni and prospective employers with resources and opportunities to successfully match student skills and interests with employer needs. The CMC placed all PGP in BA & BD students in internships within a week at leading firms like IBM, Loginext, Angelbroking, Gramener, S S Userworks, K-rises, etc.

      Course Modules taught by IBM

      IBM faculty will teach three courses: (i) Business Intelligence using Cognos BI, (ii) Big Data Analytics using IBM InfoSphere BigInsights, and (iii) Enterprise Performance Management using IBM Cognos TM1.

      Scholarships

      Dean's Scholarship for Women in Technology Leadership; Aegis Graham Bell Award Scholarship for Big Data Products; Data Science Scholar. Study loans through HDFC Credila.

      Globally acceptable Credit Structure

      This program follows a globally accepted credit structure of 45 credit units for a master's degree; in North America a master's degree is a minimum of 36 credit units. The program is spread over 9 months plus a 2-3 month internship or consulting assignment.

      Skill for various industries

      The curriculum caters to the skill requirements of various industries, such as e-commerce, telecom, banking, computer services, education, healthcare, insurance, manufacturing and retail.

      Diverse student profile

      Aegis participants come from around the world, with experience ranging from 2 to 25 years across diverse industries and backgrounds.

      Center of Innovation

      Through Aegis's initiative of the Aegis Graham Bell Awards for innovation in TIME (Telecom, Internet, Media, Edutainment) and SMAC (Social, Mobile, Analytics and Cloud), Aegis students get exposure to the best of innovation. Read...

      Delivered by the Best Data Scientists

      This program is delivered by the best data scientists from around the world, engaged in real-life data science and Big Data analytics projects. Check profiles...

      World Class LMS on Cloud

      Aegis uses a world-class Learning Management System (LMS) on the cloud called mUniversity (mUni). All lectures, study material, ebooks, business cases and video lectures are available on the LMS. Check the sample session of Mr. Srikanth Velamakanni, CEO, Fractal Analytics, on mUni...

      Big Data Day Leadership Series

      The Aegis Big Data Day Meetup provides an opportunity to meet Data Science, Big Data and Analytics leaders. Register for the latest Meetup.

            


       Download Aegis Business Analytics and Big Data Brochure

      Is Business Analytics, Big Data and Data Science for me?

      If you enjoy working with data in different ways, exploring the stories that data may reveal, and even experimenting with visualization techniques, you may enjoy a role in data science. Whether you have an academic background in business, economics, technology, statistics, mathematics, or engineering, you may be looking for a career change that prepares you to generate high impact, or searching for a challenging role that allows you to integrate the value of data into all business functions. Data analysis is necessary in almost every field and industry, and Business Analysts, Big Data professionals and Data Scientists are expected to play a role in guiding strategic decisions. They are in a unique position to assist leaders in determining the right questions to ask as well as in interpreting the answers that data provides to help determine the organization's most effective courses of action.

      Job Opportunities 

      While there is a large and growing demand for professionals trained to collect data and use analytics to drive performance, there is currently a deficit in the supply of these individuals in the global job market. The deficit in the market for data/ Big Data analytics professionals is growing exponentially and poses a serious challenge for recruiters, but it is an enviable opportunity for professionals who have the right skills and training to fill these positions.

      Program Models:

      Full Time Post Graduation Program (OnCampus)

    • Total Program: 36 Credit Units

    • Program Duration: 11 months

    • 9 months of Class room training + Live projects + One Capstone project or Data Product Development + 2 to 3 months of Internships

    • 3 terms each of 3 months

      View details and Apply


      Part Time Executive Program in Weekend model (OnCampus)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • Sat and Sun on-campus classes in Powai spread over 9 months

    • 9 months of Class room training + Live projects + One Capstone project or Data Product Development +  2-3 months industry internship/ project/ consulting assignment 

      View details and Apply


      Part Time Executive Program in Hybrid model (OnCampus + Online)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • One Block of 5 days in every 3 months plus Sat and Sun online classes spread over 11 months 

    • 9 months of Class room training + Live projects + One Capstone project or Data Product Development +  2-3 months industry internship/ project/ consulting assignment 

      View details and Apply 


      Part Time Executive Program in Online model (Online Live Interactive)

    • Total Program: 36 Credit Units

    • Program Duration: 11 Months

    • Live interactive Classes on Saturday and Sunday

    • 9 months of Class room training + Live projects + One Capstone project or Data Product Development +  2-3 months industry internship/ project/ consulting assignment      

      View details and Apply


       Download Aegis Business Analytics and Big Data Brochure

      Eligibility Criteria for Admissions: 

      - Graduates in Engineering, Technology, Maths, Stats, Economics or Management (BE/BTech/BSc (Maths, IT/CS), BCA, MSc, MA, MCA), or experience in BI, analytics, IT or software development.

      Admission process:

      • Online application (click on Apply button on top of this page to register and apply)
      • Personal Interview
      • Admission offer (if selected)

      Get in Touch with Aegis Big Data Career Adviser Team at  +91 704 531 4371/ +91 882 808 4908/ +91 902 213 7010
      You can also email your Resume to admission@aegis.edu.in and request a call back for consultation 

      Scholarships: 

      Dean’s Scholarship for Women in Technology Leadership

      Saluting the women leaders in Technology, Aegis has announced the “Dean’s Scholarship for Women in Technology Leadership”  

      Who should apply:

      Aspiring Women Leaders. Please Check Eligibility Criteria above.  

      What you get:

      • 100% scholarship to study Full Time PGP in Business Analytics and Big Data Program 
      • Learn and work on Big Data Product Factory
      • Come out as Data Scientist or Entrepreneur

      Data Science Scholar

      Who should apply:

      • PhD, MTech, MSc or MS in Computer Science, Physics, Maths, Statistics or Economics
      • or you have developed a data product

      What you get:

      • 100% scholarship to study Full Time PGP in Business Analytics and Big Data Program 
      • Learn and work on Big Data Product Factory
      • Job offer to work as a Data Scientist at the Big Data Product Factory and Aegis (optional)

      Aegis Graham Bell Award Scholarship for Big Data Products for startups

      What you get: 

      Scholarship 
      For Executive PGP in
      Business Analytics & Big Data in association with IBM

      Workspace
      At Aegis Big Data Product Factory
      in Mumbai.

      Manpower
      Access to Highly Skilled Big Data Manpower to work on your Product/Solution.

      Mentorship
      From Business Leaders, Data Scientists & Functional Heads

      Recognition
      Get recognized at the 'Aegis Graham Bell Awards', the largest innovation awards in TIME & SMAC.

      Networking
      Opportunity to network with business leaders, functional heads, peers & VCs.

      POC
      Help with proof of concept (POC)

      Funding
      Help with fundraising and introductions to VCs

      Who should apply:

      • Startup firms
      • Individuals who have developed a data product
      • Anyone with a breakthrough idea for a Big Data product, solution or platform who wants to develop it
      • See also www.bellaward.com

      How to apply for a scholarship: click the Apply button at the top and submit the application form. The admission and scholarship applications are the same.

      Aegis School Of Business & Telecommunication
      CETTM, MTNL, Technology Street, Hiranandani Gardens, Powai, Mumbai – 400076, India
      Mahesh Block –B, Sector 15, Central Business District (CBD), Belapur, New Mumbai 400614, India
      www.aegis.edu.in 

      Information retrieval

      It is the activity of obtaining information resources relevant to an information need from a collection of information resources. Searches can be based on metadata or on full-text (or other content-based) indexing.
      Automated information retrieval systems are used to reduce what has been called "information overload". Many universities and public libraries use IR systems to provide access to books, journals and other documents. Web search engines are the most visible IR applications.

      Overview :

      An information retrieval process begins when a user enters a query into the system. Queries are formal statements of information needs, for example search strings in web search engines. In information retrieval a query does not uniquely identify a single object in the collection. Instead, several objects may match the query, perhaps with different degrees of relevancy. An object is an entity that is represented by information in a database. User queries are matched against the database information. Depending on the application the data objects may be, for example, text documents, images, audio, mind maps or videos. Often the documents themselves are not kept or stored directly in the IR system, but are instead represented in the system by document surrogates or metadata. Most IR systems compute a numeric score on how well each object in the database matches the query, and rank the objects according to this value. The top ranking objects are then shown to the user. The process may then be iterated if the user wishes to refine the query.
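
      As a minimal illustration of this match-score-rank loop, the sketch below indexes three toy documents as TF-IDF vectors and ranks them against a query by cosine similarity, the vector space model covered in Week 2 of the curriculum below. It is illustrative only and assumes Python with scikit-learn installed; the corpus and query are made up.

      # A minimal ranked-retrieval sketch (illustrative only).
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      documents = [
          "Hadoop enables distributed processing of large data sets",
          "Spark supports in-memory cluster computing",
          "Information retrieval systems rank documents against a query",
      ]

      vectorizer = TfidfVectorizer()
      doc_vectors = vectorizer.fit_transform(documents)   # index the collection
      query_vector = vectorizer.transform(["rank documents for a search query"])

      # Score every document against the query, then sort by descending relevance.
      scores = cosine_similarity(query_vector, doc_vectors).ravel()
      for rank, idx in enumerate(scores.argsort()[::-1], start=1):
          print(rank, round(float(scores[idx]), 3), documents[idx])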

      Course Curriculum
      The course is organized into 24 lessons (4 hours per week), in batches of two. It combines an overview of information retrieval theory with a practical component using the Elasticsearch platform.
      Week 1 (Introduction):

      • Introduction to information retrieval;
      • Example use cases;
      • Introduction to large-scale web search and its problems
      • Unstructured data
      • Boolean retrieval example

      Week 2 (models I):

      • Term weighting
      • Vector space modeling (cosine distance)
      • Okapi BM25
      • Probabilistic models (KL divergence)

      Week 3 (models II+data preprocessing):

      • Topic modeling
      • Text classification
      • Linguistic processing: tokenization, NLP theory and resources (stemmers, WordNet, NER resolvers)

      Week 4 (elasticsearch+project assignment):

      • Elasticsearch installation walkthrough
      • Hands-on tutorial (a minimal sketch follows this list)
      • Project assignment
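
      A minimal sketch of the Week 4 hands-on material, assuming a local Elasticsearch node on the default port and the official elasticsearch Python client; the index, document type and field names are made up. Clients of this era expect a doc_type argument, which newer versions drop.

      # Illustrative only: index two toy documents and run a match query.
      from elasticsearch import Elasticsearch

      es = Elasticsearch()  # connects to http://localhost:9200 by default

      # doc_type is required by older clients and dropped by newer ones.
      es.index(index="courses", doc_type="course", id=1,
               body={"title": "Information retrieval basics"})
      es.index(index="courses", doc_type="course", id=2,
               body={"title": "Hands-on Elasticsearch tutorial"})
      es.indices.refresh(index="courses")  # make the documents searchable

      # Full-text match query; hits come back ranked by relevance score.
      result = es.search(index="courses",
                         body={"query": {"match": {"title": "tutorial"}}})
      for hit in result["hits"]["hits"]:
          print(hit["_score"], hit["_source"]["title"])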

      Week 5 (web IR):

      • HITS, PageRank
      • Web search engine dissection (crawlers, scrapers)

      Week 6 (IR evaluation + advanced topics/project discussions):

      • Precision/recall
      • DCG
      • Duplicate detection
      • Text summarization
      • Student discussions.

      Visualisation

      Visualisation, commonly known as data visualisation, is viewed by many disciplines as a modern equivalent of visual communication. It is not owned by any one field, but rather finds interpretation across many (e.g. it is viewed as a modern branch of descriptive statistics by some, but also as a grounded theory development tool by others). It involves the creation and study of the visual representation of data, meaning "information that has been abstracted in some schematic form, including attributes or variables for the units of information".

      A primary goal of data visualization is to communicate information clearly and efficiently to users via the information graphics selected, such as tables and charts. Effective visualization helps users in analyzing and reasoning about data and evidence. It makes complex data more accessible, understandable and usable. Users may have particular analytical tasks, such as making comparisons or understanding causality, and the design principle of the graphic (i.e., showing comparisons or showing causality) follows the task. Tables are generally used where users will look-up a specific measure of a variable, while charts of various types are used to show patterns or relationships in the data for one or more variables.

      Data visualization is both an art and a science. The rate at which data is generated has increased, driven by an increasingly information-based economy. Data created by internet activity and an expanding number of sensors in the environment, such as satellites and traffic cameras, are referred to as "Big Data". Processing, analyzing and communicating this data present a variety of ethical and analytical challenges for data visualization. The field of data science and practitioners called data scientists have emerged to help address this challenge.

      Course Content:

      • Data Exploration in R                                 
      • Line Chart
      • Scatter Plot
      • Histogram
      • Boxplot     
      • ggplot       
      • Tree Maps
      • D3.js
      • InfoVis
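
      The course itself works in R (ggplot) and D3.js; purely as an illustration of some of the chart types listed above, here is a minimal Python sketch with matplotlib on made-up data.

      # Illustrative only: four of the chart types above on made-up data.
      import matplotlib.pyplot as plt
      import numpy as np

      rng = np.random.default_rng(0)
      x = np.arange(10)
      y = rng.normal(size=100)

      fig, axes = plt.subplots(2, 2, figsize=(8, 6))
      axes[0, 0].plot(x, x ** 2)          # line chart: a trend over an ordered axis
      axes[0, 1].scatter(y[:50], y[50:])  # scatter plot: two variables against each other
      axes[1, 0].hist(y, bins=15)         # histogram: distribution of one variable
      axes[1, 1].boxplot(y)               # boxplot: median, quartiles and outliers
      plt.tight_layout()
      plt.show()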


      Big Data Analytics using IBM InfoSphere BigInsights

      Enterprise Grade Hadoop

      IBM® BigInsights for Apache™ Hadoop® collects and economically stores a very large set of highly variable data. IBM BigInsights enhances open source Hadoop with the complete set of capabilities to query, visualize, explore data and conduct distributed machine learning at scale resulting in deeper insight and better actions.

      IBM InfoSphere BigInsights brings the power of Hadoop to the enterprise. Apache™ Hadoop® is the open source software framework, used to reliably manage large volumes of structured and unstructured data.

      InfoSphere BigInsights Enterprise Grade Hadoop empowers enterprises of all sizes to cost-effectively manage and analyze big data: the massive volume, variety and velocity of data that consumers and businesses create every day. InfoSphere BigInsights helps increase operational efficiency by modernizing your data warehouse environment as a queryable archive, allowing you to store and analyze large volumes of multi-structured data without straining the data warehouse.

      What’s inside IBM BigInsights?

      IBM BigInsights extends the core components of Hadoop for improved usability. Enterprise-scale features from IBM are added to deliver massive scale-out data processing and analysis with built-in resiliency and fault tolerance. Simplified administration and management capabilities, rich developer tools and powerful analytic functions reduce the complexity of Hadoop.

      What’s new

      IBM® BigInsights™ for Apache™ Hadoop® supports data science teams and business analysts with:

      IBM Open Platform with Apache Hadoop builds the platform for big data projects and provides the most current Apache Hadoop open source content.

      The Open Data Platform

      IBM is a platinum founding member of this shared industry initiative focused on accelerating enterprise Apache Hadoop® innovation.

      It’s based on open standards

      Includes a foundation of standard open-source Hadoop and includes the rich open-source components that Hadoop users expect.

      It’s enterprise-ready

       Includes several optional enterprise-grade features that users can choose to implement to accelerate the value of Hadoop.

      Hadoop for the data scientist


      [Image: Hadoop use cases for the industry]

      Machine Learning and mathematical algorithms have emerged as the missing piece needed to complement Hadoop to achieve faster time to value. IBM BigInsights for Hadoop puts the full range of analytics for Hadoop into the hands of data science teams.

      Course Overview

      This course is designed to aid business analysts who are working with IBM's InfoSphere BigInsights. Writing programs that extract data from unstructured text can be a daunting task; the student will learn how to create annotators using IBM's Annotation Query Language (AQL). Analyzing data with Apache Hadoop requires writing MapReduce programs, and people familiar with Hadoop technology are aware of the other open source products used in this environment, so this course also gives the student an overview of Apache Pig, ZooKeeper, MapReduce and other Big Data components.

      Prerequisites: a programming background is advantageous, especially knowledge of SQL.

      Course Curriculum

      Part 1: InfoSphere BigInsights Basics

      1. Customer Video on Business Transformation

      2. Introduction to InfoSphere BigInsights: Classroom Session

      3. IBM Case Study and Customer Video

      4. Introduction to BigInsights Analytics for Business Analysts: Classroom Session

      5. Importing Data to InfoSphere BigInsights: Classroom Session

      6. BigSheets Workflow: Classroom Session

      7. BigSheets Collections: Classroom Session

      8. Lab Exercise 1: Creating a BigSheets Collection by Uploading a File

      9. Lab Exercise 2: Creating Collections by using Applications to gather Data

      10. Customer Video on Business Transformation

      Part 2: Working with InfoSphere BigInsights

      11. BigSheets Navigation: Classroom Session

      12. Working with BigSheets Collections: Classroom Session

      13. BigSheets Readers & Extensions: Classroom Session

      14. Lab Exercise 3: Analyzing a BigSheets Collection

      15. Lab Exercise 4: Combining Data to Create a New Collection

      16. Lab Exercise 5: Visualizing Data in Graphical Form & Exporting Data from a BigSheets Collection

      17. Lab Exercise 6: Installing the BigSheets Plug-in

      For Admissions:

      Please send your Resume to admission@aegis.edu.in 
      Get in Touch with Ritin Joshi at +91 9022137010 or Sachin Khare +91 9819008153

      IBM Business Analytics Lab

      Spark, an alternative for fast data analytics
      Although Hadoop captures the most attention for distributed data analytics, there are alternatives that provide some interesting advantages over the typical Hadoop platform. Spark is a scalable data analytics platform that incorporates primitives for in-memory computing and therefore offers some performance advantages over Hadoop's cluster storage approach. Spark is implemented in, and exploits, the Scala language, which provides a unique environment for data processing. Get to know the Spark approach to cluster computing and its differences from Hadoop.

      Spark is an open source cluster computing environment similar to Hadoop, but it has some useful differences that make it superior in certain workloads—namely, Spark enables in-memory distributed datasets that optimize iterative workloads in addition to interactive queries.

      Spark is implemented in the Scala language and uses Scala as its application framework. Unlike Hadoop, Spark and Scala create a tight integration, where Scala can easily manipulate distributed datasets as locally collective objects.

      Although Spark was created to support iterative jobs on distributed datasets, it's actually complementary to Hadoop and can run side by side over the Hadoop file system. This behavior is supported through a third-party clustering framework called Mesos. Spark was developed at the University of California, Berkeley's Algorithms, Machines, and People (AMP) Lab to build large-scale and low-latency data analytics applications.


      Spark cluster computing architecture
      Although Spark has similarities to Hadoop, it represents a new cluster computing framework with useful differences. First, Spark was designed for a specific type of workload in cluster computing—namely, those that reuse a working set of data across parallel operations (such as machine learning algorithms). To optimize for these types of workloads, Spark introduces the concept of in-memory cluster computing, where datasets can be cached in memory to reduce their latency of access.

      Spark also introduces an abstraction called resilient distributed datasets (RDDs). An RDD is a read-only collection of objects distributed across a set of nodes. These collections are resilient, because they can be rebuilt if a portion of the dataset is lost. The process of rebuilding a portion of the dataset relies on a fault-tolerance mechanism that maintains lineage (or information that allows the portion of the dataset to be re-created based on the process from which the data was derived). An RDD is represented as a Scala object and can be created from a file; as a parallelized slice (spread across nodes); as a transformation of another RDD; and finally through changing the persistence of an existing RDD, such as requesting that it be cached in memory.

      Applications in Spark are called drivers, and these drivers implement the operations performed either on a single node or in parallel across a set of nodes. Like Hadoop, Spark supports a single-node cluster or a multi-node cluster. For multi-node operation, Spark relies on the Mesos cluster manager. Mesos provides an efficient platform for resource sharing and isolation for distributed applications (see Figure 1). This setup allows Spark to coexist with Hadoop in a single shared pool of nodes.
      Figure 1. Spark relies on the Mesos cluster manager for resource sharing and isolation.


      Spark programming model
      A driver can perform two types of operations on a dataset: an action and a transformation. An action performs a computation on a dataset and returns a value to the driver; a transformation creates a new dataset from an existing dataset. Examples of actions include performing a Reduce operation (using a function) and iterating a dataset (running a function on each element, similar to the Map operation). Examples of transformations include the Map operation and the Cache operation (which requests that the new dataset be stored in memory).
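
      As a concrete illustration of this model, the minimal PySpark sketch below (the data and application name are made up) creates an RDD from a local collection, chains two transformations, requests in-memory caching, and then triggers computation with two actions.

      # Illustrative only: the driver model described above, in PySpark.
      from pyspark import SparkContext

      sc = SparkContext("local", "rdd-demo")        # the driver's entry point

      numbers = sc.parallelize(range(1, 11))        # RDD from a local collection
      squares = numbers.map(lambda n: n * n)        # transformation: lazily defined
      evens = squares.filter(lambda n: n % 2 == 0)  # transformation: derived RDD
      evens.cache()                                 # request in-memory persistence

      print(evens.count())                          # action: triggers computation
      print(evens.reduce(lambda a, b: a + b))       # action: aggregate to the driver

      sc.stop()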

      Course Content:

      • Introduction to Data Analysis with Spark
      • L1. Installing and running Spark.
        1. What is Apache Spark?
        2. Downloading and installing Spark.
        3. A first Spark program in Python, Scala and Java.
      • L2. RDD operations.
        1. Initializing a SparkContext.
        2. Loading and saving data into an RDD.
        3. RDD Basics.
        4. Passing Functions to Spark.
        5. Common Transformations and Actions.
        6. Working with Key-Value Pairs.
      • Laboratory exercise 1.
      • Basic operations with RDD.
      • Spark SQL and Spark Streaming
      • L3. Programmatically Specifying the Schema.
        1. Initializing Spark SQL.
        2. Basic Query Example.
        3. SchemaRDDs.
      • L4. Basic Concepts of Spark Streaming.
        1. Initializing StreamingContext.
        2. Stateless and stateful transformations.
        3. Core Sources.
        4. Additional Sources.
      • Laboratory exercise 2.
      • SQL operations and log files analysis.
      • Machine Learning with MLlib
      • L5. Obtaining, Processing, and Preparing Data with Spark.
        1. Data Types.
        2. Exploring and visualizing data.
        3. Processing and transforming data.
        4. Extracting useful features from data.
        5. Basic statistics.
      • L6. Classification and regression with Spark MLlib (see the sketch after this list).
        1. Regression models and Classifiers.
        2. Naive Bayes.
        3. Advanced Text Processing.
      • Laboratory exercise 3.
      • Text classification.
      • Spark Graph Processing
      • L7. The Property Graph.
        1. Getting Started.
        2. Example Property Graph.
        3. Graph Operators.
      • L8. Graph Algorithms.
        1. PageRank.
        2. Connected Components.
        3. Triangle Counting.
      • Laboratory exercise 4.
      • Social Network Graph building.
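
      For the MLlib portion (L6 above), here is a minimal classification sketch using the RDD-based pyspark.mllib API of that Spark generation; the features and labels are toy values, not course data.

      # Illustrative only: Naive Bayes classification with pyspark.mllib.
      from pyspark import SparkContext
      from pyspark.mllib.classification import NaiveBayes
      from pyspark.mllib.regression import LabeledPoint

      sc = SparkContext("local", "mllib-demo")

      # Label 1.0 when the second feature dominates, 0.0 otherwise.
      training = sc.parallelize([
          LabeledPoint(0.0, [1.0, 0.0]),
          LabeledPoint(0.0, [2.0, 1.0]),
          LabeledPoint(1.0, [0.0, 5.0]),
          LabeledPoint(1.0, [1.0, 6.0]),
      ])

      model = NaiveBayes.train(training)
      print(model.predict([0.0, 4.0]))  # expected to predict the 1.0 class

      sc.stop()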

      Cognos TM1

      Machine Learning and Data Mining

      This course offers a thorough grounding in machine learning concepts, as well as practical advice on applying machine learning tools and techniques in real-world data mining situations. It teaches you everything you need to know about preparing inputs, interpreting outputs, evaluating results, and the algorithmic methods at the heart of successful data mining.
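
      As a small taste of that workflow (splitting data into train and test sets, fitting a classifier, and reading precision and recall), here is a minimal scikit-learn sketch on a built-in dataset; it is illustrative only and not drawn from the course materials.

      # Illustrative only: split, fit, and evaluate with precision/recall.
      from sklearn.datasets import load_iris
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import classification_report
      from sklearn.model_selection import train_test_split

      X, y = load_iris(return_X_y=True)
      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.3, random_state=0)

      model = LogisticRegression(max_iter=200).fit(X_train, y_train)
      print(classification_report(y_test, model.predict(X_test)))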

      Course Content/Syllabus of Machine Learning and Data Mining

      • Introduction
        • Machine Learning Basics
        • Data Mining Basics
        • Supervised and Unsupervised Learning
      • Regression Analysis
        • Linear Regression
        • Logistic Regression
      • Regularization
        • Bias
        • Variance
        • Over-fitting
      • Classifiers
        • Decision Trees & Random Forest
        • k-Nearest Neighbor
        • Support Vector Machines
      • Clustering
        • K-Means Clustering
      • Applying Machine Learning
        • Model Selection
        • Train/Test/Validation sets
        • Learning curves
      • Machine Learning Model Design
        • Error Analysis
        • Error Metric
        • Precision and Recall
      • Dimensionality Reduction
        • Principal Component Analysis

      Hadoop 

      What is Hadoop? 

      Apache™ Hadoop® is an open source software project that enables distributed processing of large data sets across clusters of commodity servers. It is designed to scale up from a single server to thousands of machines, with a very high degree of fault tolerance. Rather than relying on high-end hardware, the resiliency of these clusters comes from the software's ability to detect and handle failures at the application layer.

      It is a data storage and processing system that enables data storage, file sharing, data analytics, and more. The technology is scalable and enables effective analysis of large volumes of unstructured data, thereby adding value. With the growing role of social media and internet communication, Hadoop is used by a wide spectrum of companies ranging from Facebook to Yahoo. Other big users of Hadoop include Cloudera, Hortonworks, IBM, Amazon, Intel, MapR and Microsoft. The technology lets its users handle more data through enhanced storage capacity and enables data recovery in case of hardware failure.

      Hadoop increasingly offers data retrieval and data security features, and these features keep improving, making it an ever stronger complement to database management systems (DBMS). Hadoop software is the fastest-growing segment of the market compared with hardware and services.

      Who is Using Hadoop? 

      With the growing role of social media and internet communication, a wide spectrum of companies ranging from Facebook to Yahoo use Hadoop to handle more data through enhanced storage capacity and to recover data in case of hardware failure:

      • Yahoo (one of the biggest users, and contributor of more than 80% of Hadoop's code)
      • Facebook
      • Cloudera
      • Hortonworks
      • IBM
      • Intel
      • MapR
      • Microsoft
      • Netflix 
      • Amazon
      • Adobe 
      • eBay
      • Hulu
      • Twitter
      • Snapdeal
      • TataSky

      Why use Hadoop?

      Hadoop changes the economics and the dynamics of large-scale computing. Its impact can be boiled down to four salient characteristics. Hadoop enables a computing solution that is:

      Scalable

       A cluster can be expanded by adding new servers or resources without having to move, reformat, or change the dependent analytic workflows or applications.

      Cost effective

       Hadoop brings massively parallel computing to commodity servers. The result is a sizeable decrease in the cost per terabyte of storage, which in turn makes it affordable to model all your data.

      Flexible

      Hadoop is schema-less and can absorb any type of data, structured or not, from any number of sources. Data from multiple sources can be joined and aggregated in arbitrary ways enabling deeper analyses than any one system can provide.

      Fault tolerant

       When you lose a node, the system redirects work to another location of the data and continues processing without missing a beat.

      Hadoop architecture

      Hadoop is composed of four core components—Hadoop Common, Hadoop Distributed File System (HDFS), MapReduce and YARN.

      Hadoop for the Enterprise from IBM

      IBM BigInsights brings the power of Hadoop to the enterprise, enhancing Hadoop by adding administrative, discovery, development, provisioning, security and support capabilities, along with best-in-class analytical capabilities. IBM® BigInsights™ for Apache™ Hadoop® is a complete solution for large-scale analytics. Explore Big Data Analytics using IBM InfoSphere BigInsights, taught by IBM experts at Aegis.

      Sample Jobs in Hadoop

      • Hadoop Developer
      • Hadoop Architect
      • Hadoop Tester
      • Hadoop Administrator
      • Data Scientist

      Companies Recruiting

      • IBM
      • Myntra
      • Snapdeal
      • HP
      • EMC
      • Cloudera

      Hadoop Course Overview

      Through lectures, hands-on exercises, case studies, and projects, students will explore the Hadoop ecosystem, learning topics such as:

      • What is Hadoop and the real-world problems it solves
      • Understand MapReduce concepts and how it works
      • Write MapReduce programs (a minimal streaming sketch follows this list)
      • Architect Hadoop-based applications
      • Understand Hadoop operations
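
      As a minimal illustration of writing a MapReduce program, the classic word count can be expressed as two small Python scripts run under Hadoop Streaming; the script names are illustrative, and the exact path to the streaming jar varies by distribution.

      # mapper.py -- emit "word<TAB>1" for every word read from stdin.
      import sys

      for line in sys.stdin:
          for word in line.split():
              print(word.lower() + "\t1")

      # reducer.py -- sum the counts per word; Hadoop sorts the mapper output
      # by key, so identical words arrive on consecutive lines.
      import sys

      current_word, count = None, 0
      for line in sys.stdin:
          word, value = line.rsplit("\t", 1)
          if word != current_word:
              if current_word is not None:
                  print(current_word + "\t" + str(count))
              current_word, count = word, 0
          count += int(value)
      if current_word is not None:
          print(current_word + "\t" + str(count))

      A typical invocation would be along the lines of: hadoop jar hadoop-streaming.jar -input <in> -output <out> -mapper mapper.py -reducer reducer.py -file mapper.py -file reducer.py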

      Prerequisites and Requirements

      • Programming proficiency in Java or Python is required. Prior knowledge of Apache Hadoop is not required.
      • The primary language during lectures will be Java. However, assignments and projects completed in either Java or Python will be accepted.

      Note: For students who do not have a programming background in Java or Python, additional readings or learning videos will be prescribed. The programming prerequisites will need to be completed within the first 2 weeks of the course.

      Course Contents

      1. Introduction to Hadoop: Real-World Hadoop Applications and Use Cases, Hadoop Ecosystem & projects, Types of Hadoop Processing, Hadoop Distributions, Hadoop Installation.
      2. Hadoop MapReduce: Developing a MapReduce Application, MapReduce Internals, MapReduce I/O, Examples illustrating MapReduce Features.
      3. Hadoop Application Architectures: Data Modeling, Data Ingestion, and Data Processing in Hadoop.
      4. Hadoop Projects: Practical Tips for Hadoop Projects.
      5. Hadoop Operations: Planning a Hadoop Cluster, Installation and Configuration, Security, and Monitoring.
      6. Hadoop Case Studies: A series of Group Discussions and Student Project Presentations.

      References

      1. Lecture notes will form the primary reference material for the course.
      2. Specific reading material, papers, and videos will be assigned during the course to augment the lecture notes.
      3. Reference Textbook: Hadoop: The Definitive Guide, 4th Edition, Tom White, O’Reilly Media, Inc.

      Evaluation

      1. Homework Assignments: Students will be assigned 5 homework assignments during the course: 10%
      2. Quizzes: The best 2 out of 3 unannounced quizzes will be counted towards the final grade: 10%
      3. Mid-Term Exam: 20%
      4. Final Exam: 40%
      5. Student Project: The students will individually, or optionally in teams of two, research, design, implement, and present a substantial term project: 20%

        Consult our Big Data Career Advisers +91 704 531 4371 /+91 981 900 8153 on how to add wings to your career

       

      Introduction to Data Science

      There is much debate among scholars and practitioners about what data science is, and what it isn’t. Does it deal only with big data? What constitutes big data? Is data science really that new? How is it different from statistics and analytics?

      One way to consider data science is as an evolutionary step in interdisciplinary fields like business analysis that incorporate computer science, modeling, statistics, analytics, and mathematics. Data science is the study of the generalizable extraction of knowledge from data, yet the key word is science. It incorporates varying elements and builds on techniques and theories from many fields, including signal processing, mathematics, probability models, machine learning, statistical learning, computer programming, data engineering, pattern recognition and learning, visualization, uncertainty modeling, data warehousing, and high performance computing with the goal of extracting meaning from data and creating data products.

      From government, social networks and ecommerce sites to sensors, smart meters and mobile networks, data is being collected at an unprecedented speed and scale. Data science can put big data to use.

      [Factoid graphics: the average number of "likes" and "comments" posted on Facebook daily; the percentage of the world's data produced in the last two years; the projected volume of e-commerce transactions in 2016.]

      Data Science is not restricted to only big data, although the fact that data is scaling up makes big data an important aspect of data science.

      Course Coverage:

      • What is data science?
      • Evolution
      • Types of Analytics
      • Role of Data Scientist
      • Use Cases

      PGP_Sep 2015_Fundamentals Of Coding

      EPGP_Industry Session

      About Technology

      IBM Cognos Business Intelligence turns data into past, present and future views of your organization’s operations and performance so your decision makers can capitalize on opportunities and minimize risks. You can use these views to understand the immediate and downstream effects of decisions that span potentially complex interrelated factors. Consistent snapshots of business performance are provided in enterprise-class reports and independently assembled dashboards based on trusted information. As a result, non-technical and technical business intelligence (BI) users and IT alike can respond quickly to rapidly changing business needs.

      Cognos is a suite of Business Intelligence software developed by IBM. This training is designed for business users with no technical knowledge who need to extract corporate data, analyze it and assemble reports. Cognos 10.1 is a combination of nearly three dozen software products.
      The main objective of Cognos Business Intelligence is to deliver the right information at the right time. Usually, the data is pulled from several sources and transformed into accurate and consistent information which is stored in the data warehouse. Cognos BI is now one of the most widely used Business Intelligence tool suites globally, recognized as a leader in Business Intelligence reporting.

      PGP_Sept 2015_Big Data Use Cases

      Use cases of Big Data and Data Science

      Industry Lectures

      Advanced ML

      Pricing & Revenue Optimization

      Analytics can be an extremely powerful tool for measuring and maximizing the value of your business. Web analytics is the measurement, collection, analysis and reporting of web data for the purposes of understanding and optimizing web usage. This course shows you what analytics is, what it can do, and how it will change your online business. Learn how to apply value measurement to your website and figure out which pages convince people to become customers and which pages push them away. It also shows how to integrate and measure your SEO and social media campaigns so that you can measure the full benefit of each.

      • Introduction
      • Navigating Google Analytics
      • Traffic Sources
      • Content
      • Visitors
      • Goals & Ecommerce
      • Actionable Insights and the Big Picture
      • Web analytics tools
      • Making better decisions
      • Summing up
      • Common mistakes analysts make
      • Social media analytics
      • Social CRM & Analytics

      Tableau is the powerful data visualization program from Tableau Software. This course introduces key skills such as creating visualizations, combining data from multiple sources into a single view, managing Tableau workbooks, creating different types of visualizations, and summarizing data using maps and dashboards, giving you a deeper understanding of your data with Tableau. It covers the important concepts and techniques used in Tableau to move from simple to complex visualizations and shows how to combine them in interactive dashboards, along with the Tableau interface and paradigm: components, shelves, data elements, and terminology.

      Topics included:

      • Installing and running Tableau
      • Connecting to a data source
      • Joining related data sources
      • Visual Analytics
      • Displaying the data underlying a workbook
      • Adding, duplicating, and renaming worksheets
      • Creating a packaged workbook
      • Mapping
      • Calculations
      • Dashboards and Stories
      • Tableau Server
      • Tableau Online


      Apache Pig

      Apache Pig is a platform for analyzing large data sets that consists of a high-level language for expressing data analysis programs, coupled with infrastructure for evaluating these programs. The salient property of Pig programs is that their structure is amenable to substantial parallelization, which in turn enables them to handle very large data sets.

      At the present time, Pig's infrastructure layer consists of a compiler that produces sequences of Map-Reduce programs, for which large-scale parallel implementations already exist (e.g., the Hadoop subproject). Pig's language layer currently consists of a textual language called Pig Latin, which has the following key properties:

      • Ease of programming. It is trivial to achieve parallel execution of simple, "embarrassingly parallel" data analysis tasks. Complex tasks composed of multiple interrelated data transformations are explicitly encoded as data flow sequences, making them easy to write, understand, and maintain.
      • Optimization opportunities. The way in which tasks are encoded permits the system to optimize their execution automatically, allowing the user to focus on semantics rather than efficiency.
      • Extensibility. Users can create their own functions to do special-purpose processing.

      Apache Flume

      Flume is a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of log data. It has a simple and flexible architecture based on streaming data flows. It is robust and fault tolerant with tunable reliability mechanisms and many failover and recovery mechanisms. It uses a simple extensible data model that allows for online analytic application.

      [Image: Flume agent component diagram]

      Fundamentals of Microsoft Azure

      Microsoft Azure is a cloud computing platform and infrastructure, created by Microsoft, for building, deploying and managing applications and services through a global network of Microsoft-managed datacenters. It provides both PaaS and IaaS services and supports many different programming languages, tools and frameworks, including both Microsoft-specific and third-party software and systems.

      Course Content:

      • Introduction
      • Why Microsoft Azure?
      • The Scope of Microsoft Azure Services
      • Tools and APIs You Use to Access Azure
      • Navigating the Azure Portal and the Preview Portal
      • Creating an Organizational Account and Subscription
      • Understanding Subscriptions and Directories
      • How Do I: Manage Directories
      • How Do I: Manage Users, Subscription Roles, and Directory Roles
      • How Do I: Manage Subscriptions and Service Administrators
      • Understanding Geos, Regions, and Datacenters

      Architecting on AWS

      Architecting on AWS covers the fundamentals of AWS. It is designed to teach Solution Architects how to optimize the use of the AWS Cloud by understanding AWS services and how these services fit into a cloud solution. Your architectural solution may differ depending on industry and size of business. Because there is no one-size-fits-all design, this course highlights some AWS Cloud design patterns to help you learn how a service may fit in the overall cloud design. It also covers the best practices and lessons learned.

      Course Objectives:

      • Make architectural decisions based on the AWS recommended architectural principles and best practices
      • Demonstrate basic knowledge of security best practices when using AWS
      • Create a cloud migration roadmap and a plan
      • Leverage AWS services to make your servers scalable
      • Create a business continuity plan and achieve High Availability

      Course Outline:

      • Leveraging Global Infrastructure
      • Extending On-Premises into the Cloud
      • Computing in the Cloud
      • Designing Storage Subsystems
      • Distributed Environments
      • Choosing a Datastore
      • Designing Web-Scale Media Hosting
      • Event Driven Scaling
      • Infrastructure as Code
      • Orchestrating Batch Processing
      • Reviewing Large Scale Design Patterns
      • Designing for Cost
      • Planning for High Availability and Disaster Recovery

      AWS Business Essentials

      AWS Business Essentials helps IT business leaders and professionals understand the benefits of cloud computing and how a cloud strategy can help you meet your business objectives. In this course we discuss the advantages of cloud computing for your business and the fundamentals of AWS, including financial benefits. This course also introduces you to successful cloud adoption frameworks to help you consider the AWS platform within your cloud computing strategy.

      Course Objectives:

      • Identify the value and benefits of the AWS cloud
      • Recognize the valuable ways that the AWS platform can be used
      • Understand the robust security capabilities, controls, and assurances in place to maintain security and data protection
      • Articulate the financial impact the AWS cloud can have on an organization’s procurement cycle, cost management, and contracts, while minimizing risks associated with consumption-based pricing models

      Course Content:

      • Benefits of Cloud Computing and Defining Your Cloud Strategy
      • Introduction to the AWS Cloud
      • Security and Compliance
      • Cloud Financials
      • Migrating to the Cloud: Next Steps

      Analytics for Hadoop on Bluemix

      With the advent of IBM Bluemix, it has never been easier to start playing with Hadoop, specifically IBM's Analytics for Hadoop. The IBM Analytics for Hadoop Bluemix service allows developers to leverage the power of IBM's Hadoop offering, BigInsights, to power the latest mobile, web and cloud applications. It also provides an opportunity for enterprise IT administrators who are looking to adopt Hadoop to explore the rich enterprise features which BigInsights provides.

      Course Content

      • Introduction to IBM Bluemix
      • Introduction to Hadoop
      • Signing into Bluemix and starting Analytics for Hadoop
      • Navigating the BigInsights console
      • Loading data into BigInsights
      • Exploring data with BigSheets
      • Live project

      Big Data on AWS: Amazon Elastic MapReduce (EMR)

      Description

      This course introduces you to cloud-based big data solutions and Amazon Elastic MapReduce (EMR), the AWS big data platform. In this course, we show you how to use Amazon EMR to process data using the broad ecosystem of Hadoop tools like Pig and Hive. We also teach you how to create big data environments, work with Amazon DynamoDB, Amazon Redshift, and Amazon Kinesis, and leverage best practices to design big data environments for security and cost-effectiveness.
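
      For orientation, launching a small cluster can also be scripted with the boto3 Python SDK, as in the sketch below; the cluster name, release label, instance settings and region are placeholders, and the default EMR IAM roles are assumed to already exist.

      # Illustrative only: launch a small EMR cluster with boto3.
      import boto3

      emr = boto3.client("emr", region_name="us-east-1")  # placeholder region

      response = emr.run_job_flow(
          Name="demo-cluster",                 # placeholder name
          ReleaseLabel="emr-4.7.0",            # placeholder release label
          Applications=[{"Name": "Hadoop"}, {"Name": "Hive"}, {"Name": "Pig"}],
          Instances={
              "MasterInstanceType": "m3.xlarge",
              "SlaveInstanceType": "m3.xlarge",
              "InstanceCount": 3,
              "KeepJobFlowAliveWhenNoSteps": True,
          },
          JobFlowRole="EMR_EC2_DefaultRole",   # default EMR roles assumed to exist
          ServiceRole="EMR_DefaultRole",
      )
      print(response["JobFlowId"])             # cluster id to track or terminate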

      Course Objectives

      This course is designed to teach you how to:

      • Understand Apache Hadoop in the context of Amazon EMR
      • Understand the architecture of an Amazon EMR cluster
      • Launch an Amazon EMR cluster using an appropriate Amazon Machine Image and Amazon EC2 instance types
      • Choose appropriate AWS data storage options for use with Amazon EMR
      • Know your options for ingesting, transferring, and compressing data for use with Amazon EMR
      • Use common programming frameworks available for Amazon EMR including Hive, Pig, and Streaming
      • Work with Amazon Redshift to implement a big data solution
      • Leverage big data visualization software
      • Choose appropriate security options for Amazon EMR and your data
      • Perform in-memory data analysis with Spark and Shark on Amazon EMR
      • Choose appropriate options to manage your Amazon EMR environment cost-effectively
      • Understand the benefits of using Amazon Kinesis for big data

      Intended Audience

      This course is intended for:

      • Individuals responsible for designing and implementing big data solutions, namely Solutions Architects and SysOps Administrators
      • Data Scientists and Data Analysts interested in learning about big data solutions on AWS

      Prerequisites

      We recommend that attendees of this course have:

      • Basic familiarity with big data technologies, including Apache Hadoop and HDFS
        • Knowledge of big data technologies such as Pig, Hive, and MapReduce is helpful but not required
      • Working knowledge of core AWS services and public cloud implementation
      • Basic understanding of data warehousing, relational database systems, and database design

      Hands-On Activity

      This course allows you to test new skills and apply knowledge to your working environment through a variety of practical exercises

      Course Outline

      Module 1

      • Overview of Big Data, Apache Hadoop, and the Benefits of Amazon EMR
      • Amazon EMR Architecture
      • Using Amazon EMR
      • Launching and Using an Amazon EMR Cluster
      • Hadoop Programming Frameworks

      Module 2

      • Using Hive for Advertising Analytics
      • Using Streaming for Life Sciences Analytics
      • Overview: Spark and Shark for In-Memory Analytics
      • Using Spark and Shark for In-Memory Analytics
      • Managing Amazon EMR Costs
      • Overview of Amazon EMR Security
      • Data Ingestion, Transfer, and Compression
      • Using Amazon Kinesis for Real-Time Big Data Processing


      Module 3

      • Using Amazon Kinesis and Amazon EMR to Stream and Process Big Data
      • AWS Data Storage Options
      • Using DynamoDB with Amazon EMR
      • Overview: Amazon Redshift and Big Data
      • Using Amazon Redshift for Big Data
      • Visualizing and Orchestrating Big Data
      • Using Tableau Desktop or Jaspersoft BI to Visualize Big Data

      Systems Operations on AWS

      Description

      System Operations on AWS is designed to teach those in a Systems Administrator or Developer Operations (DevOps) role how to create automatable and repeatable deployments of networks and systems on the AWS platform. The course covers the specific AWS features and tools related to configuration and deployment, as well as common techniques used throughout the industry for configuring and deploying systems.
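
      To make the idea of automated, repeatable deployments concrete, here is a minimal sketch (not from the course) that creates a CloudFormation stack from an inline template using boto3; the stack name and its single S3 bucket resource are hypothetical.

      # Minimal sketch: a repeatable, one-resource CloudFormation deployment.
      # Assumes boto3 is configured with credentials; names are hypothetical.
      import json
      import boto3

      template = {
          "AWSTemplateFormatVersion": "2010-09-09",
          "Resources": {
              "DemoBucket": {                    # hypothetical resource
                  "Type": "AWS::S3::Bucket"
              }
          },
      }

      cfn = boto3.client("cloudformation", region_name="us-east-1")
      cfn.create_stack(
          StackName="sysops-demo-stack",         # hypothetical stack name
          TemplateBody=json.dumps(template),
      )
      # The same stack can be torn down and re-created identically:
      # cfn.delete_stack(StackName="sysops-demo-stack")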

      Course Objectives

      This course is designed to teach you how to:

      • Use standard AWS infrastructure features such as Amazon Virtual Private Cloud (VPC), Amazon Elastic Compute Cloud (EC2), Elastic Load Balancing, and Auto Scaling from the command line
      • Use AWS CloudFormation and other automation technologies to produce stacks of AWS resources that can be deployed in an automated, repeatable fashion
      • Build functioning virtual private networks with Amazon VPC from the ground up using the AWS Management Console
      • Deploy Amazon EC2 instances using command line calls and troubleshoot the most common problems with instances
      • Monitor the health of Amazon EC2 instances and other AWS services
      • Manage user identity, AWS permissions, and security in the cloud
      • Manage resource consumption in an AWS account using tools such as Amazon CloudWatch, tagging, and Trusted Advisor
      • Select and implement the best strategy for creating reusable Amazon EC2 instances
      • Configure a set of Amazon EC2 instances that launch behind a load balancer, with the system scaling up and down in response to demand
      • Edit and troubleshoot a basic AWS CloudFormation stack definition

      Intended Audience

      This course is intended for:

      • System Administrators
      • Software Developers, especially those in a Developer Operations (DevOps) role

      Prerequisites

      We recommend that attendees of this course have the following prerequisites:

      • Attended the AWS Technical Fundamentals course
      • Background in either software development or systems administration
      • Some experience with maintaining operating systems at the command line (shell scripting in Linux environments, cmd or PowerShell in Windows)
      • Basic knowledge of networking protocols (TCP/IP, HTTP)

      Course Outline

      Module 1

      • System Operations on AWS Overview
      • Networking in the Cloud
      • Computing in the Cloud

      Module 2

      • Storage and Archiving in the Cloud
      • Monitoring in the Cloud
      • Managing Resource Consumption in the Cloud

      Module 3

      • Configuration Management in the Cloud
      • Creating Scalable Deployments in the Cloud
      • Creating Automated and Repeatable Deployments

      Amazon Web Services (AWS) Technical Fundamentals

      Description

      AWS Technical Fundamentals introduces you to AWS products, services, and common solutions. It provides technical end users with the fundamentals to become more proficient in identifying AWS services, so that you can make informed decisions about IT solutions based on your business requirements.

      Course Objectives

      This course is designed to teach you how to:

      • Recognize terminology and concepts as they relate to the AWS platform
      • Navigate the AWS Management Console
      • Understand the security measures AWS provides
      • Differentiate AWS Storage options and create an Amazon Simple Storage Service (S3) bucket (see the sketch after this list)
      • Recognize AWS Compute and Networking options and use Amazon Elastic Compute Cloud (EC2) and Amazon Elastic Block Storage (EBS)
      • Describe Managed Services and Database options
      • Use Amazon Relational Database Service (RDS) to launch an application
      • Identify Deployment and Management options
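
      The sketch below, referenced from the storage objective above, shows what creating and using an S3 bucket looks like with boto3; the bucket name is hypothetical, and bucket names must be globally unique in practice.

      # Minimal sketch: create an S3 bucket, upload an object, list the contents.
      # Assumes AWS credentials are configured; the bucket name is hypothetical.
      import boto3

      s3 = boto3.client("s3", region_name="us-east-1")
      s3.create_bucket(Bucket="my-fundamentals-demo-bucket")

      # Upload a small object, then list what the bucket contains.
      s3.put_object(Bucket="my-fundamentals-demo-bucket", Key="hello.txt", Body=b"hello")
      listing = s3.list_objects_v2(Bucket="my-fundamentals-demo-bucket")
      for obj in listing.get("Contents", []):
          print(obj["Key"])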

      Intended Audience

      This course is intended for:

      • Individuals responsible for articulating the technical benefits of AWS services
      • Any individual who is new to AWS
      • SysOps Administrators and Developers who are interested in using AWS services

      Hands-On Activity

      This course includes activities that will allow you to test new skills and apply knowledge through hands-on lab activities.

      Course Outline

      Module 1: Introduction & History of AWS

      • Navigate the AWS Management Console
      • Recognize AWS Global Infrastructure
      • Describe the security measures AWS provides

      Module 2: AWS Storage & Content Delivery

      • Identify key AWS storage options
      • Describe Amazon EBS
      • Create an Amazon S3 bucket and manage associated objects

      Module 3: Compute Services & Networking

      • Identify the different AWS compute and networking options
      • Describe an Amazon Virtual Private Cloud (VPC)
      • Create an Amazon EC2 instance
      • Verify how to use Amazon EBS

      Module 4: AWS Managed Services & Database

      • Describe Amazon DynamoDB
      • Verify key aspects of Amazon RDS
      • Execute an Amazon RDS-driven application

      Module 5: Deployment and Management

      • Identify AWS CloudFormation
      • Describe Amazon CloudWatch metrics and alarms
      • Describe Amazon Identity and Access Management (IAM)

      Python is a powerful, flexible, open-source language that is easy to learn, easy to use, and has powerful libraries for data manipulation and analysis. It is among the top three tools for analytics applications, along with R and SAS. Having originated as an open-source scripting language, Python's usage has grown over time. Today, it sports libraries (numpy, scipy and matplotlib) and functions for almost any statistical operation or model building you may want to do. Since the introduction of pandas, it has become very strong in operations on structured data. Its simple syntax is very accessible to programming novices, and will look familiar to anyone with experience in Matlab, C/C++, Java, or Visual Basic. Python has a unique combination of being both a capable general-purpose programming language and easy to use for analytical and quantitative computing.

      For over a decade, Python has been used in scientific computing and highly quantitative domains such as finance, oil and gas, physics, and signal processing. It has been used to improve Space Shuttle mission design, process images from the Hubble Space Telescope, and was instrumental in orchestrating the physics experiments which led to the discovery of the Higgs boson (the so-called "God particle"). At the same time, Python has been used to build massively scalable web applications like YouTube, and has powered much of Google's internal infrastructure. Python is easy for analysts to learn and use, but powerful enough to tackle even the most difficult problems in virtually any domain. It integrates well with existing IT infrastructure and is very platform independent. Among modern languages, the agility and productivity of Python-based solutions are legendary. Companies of all sizes and in all areas - from the biggest investment banks to the smallest social/mobile web app startups - are using Python to run their business and manage their data.

      This course covers the Python programming language and standard library, with a focus on applying Python to problems in scripting, data analysis, and systems programming. It assumes no prior experience with Python, but does assume participants know how to program in another programming language. It is especially well suited for programmers who want to know what Python is all about without extra fluff. A short sketch after the topic list below gives a flavour of the material.

      Course Content:

      • Strings and Lists
      • Loops
      • Functions
      • Tuples, Dictionaries and Files
      • Regular Expressions
      • NLTK
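
      As a taste of two of these topics, here is a minimal sketch combining regular expressions from the standard library with NLTK tokenization; it assumes the nltk package and its 'punkt' tokenizer data are installed.

      # Minimal sketch: regular expressions plus NLTK tokenization.
      import re
      import nltk

      nltk.download("punkt", quiet=True)   # tokenizer data; newer NLTK may also need "punkt_tab"

      text = "Python 3 was released in 2008. It is widely used for data analysis."

      # Regular expressions: find four-digit years.
      print(re.findall(r"\b(?:19|20)\d{2}\b", text))      # ['2008']

      # NLTK: sentence and word tokenization.
      from nltk.tokenize import sent_tokenize, word_tokenize
      print(sent_tokenize(text))                          # two sentences
      print(word_tokenize(text)[:6])                      # first few word tokens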

      OpenStack

      This course assists administrators and users to configure, manage, and use the OpenStack cloud services platform. An architectural overview ensures understanding of various OpenStack projects and their functions. Hands-on labs provide configuration and operation experience with major aspects of the OpenStack environment.
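
      As a flavour of driving OpenStack programmatically, here is a minimal sketch assuming the openstacksdk package and a clouds.yaml entry named "mycloud" (both hypothetical here); it lists images and running servers, two of the services covered in the content below.

      # Minimal sketch: connect to an OpenStack cloud and list a few resources.
      # Assumes openstacksdk is installed and clouds.yaml defines "mycloud".
      import openstack

      conn = openstack.connect(cloud="mycloud")

      for image in conn.image.images():
          print("image:", image.name)
      for server in conn.compute.servers():
          print("server:", server.name, server.status)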

      Course Content:
      • OpenStack Block Storage
      • OpenStack Metering
      • OpenStack Identity
      • OpenStack Orchestration
      • OpenStack Dashboard
      • OpenStack Networking
      • OpenStack Image Service
      • OpenStack Object Storage
      • OpenStack Compute

      Big Data Events

      Statistics and Probability for Data Analysis

      Decision making under uncertainty is largely based on applying statistical data analysis for probabilistic risk assessment of your decisions. Managers need to understand variation for two key reasons: first, so that they can lead others to apply statistical thinking in day-to-day activities, and second, to apply the concept for the purpose of continuous improvement. This course provides hands-on experience to promote the use of statistical thinking and techniques, applying them to make educated decisions whenever there is variation in business data.

      This course exposes you to elementary statistics and probability concepts that provide the necessary foundation for application in data analysis; a short sketch follows the topic lists below.

      Statistics

      • Dataset
      • Descriptive
      • Percentiles
      • Inferential
      • Scaling
      • Normalization
      • Transformation
      • Outliers Detection/Fixation
      • Sparseness Elimination
      • Sampling
      • Exploratory Data Analysis
      • ANOVA
      • Correlation Analysis
      • Feature Selection
      • Uncorrelated Feature Elimination

      Probability

      • Probability Theory
      • Distribution
      • Hypothesis Testing
      • Significance (p-) value
      • Maximum Likelihood
      • Monte Carlo
      • Regression Analysis
      • Time Series
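
      As a small taste of these topics, here is a minimal sketch using NumPy and SciPy: descriptive statistics, a percentile, and a one-sample t-test on an invented sample.

      # Minimal sketch: descriptive statistics and a hypothesis test.
      import numpy as np
      from scipy import stats

      data = np.array([12.1, 11.8, 12.5, 12.0, 11.6, 12.9, 12.2, 11.9])  # invented sample

      print("mean:", data.mean(), "std:", data.std(ddof=1))
      print("90th percentile:", np.percentile(data, 90))

      # Hypothesis test: is the population mean plausibly 12.0?
      t_stat, p_value = stats.ttest_1samp(data, popmean=12.0)
      print("t =", round(t_stat, 3), "p =", round(p_value, 3))  # large p: fail to reject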

      Machine Learning and Data Mining

      This course offers a thorough grounding in machine learning concepts as well as practical advice on applying machine learning tools and techniques in real-world data mining situations. It teaches you how to prepare inputs, interpret outputs, evaluate results, and understand the algorithmic methods at the heart of successful data mining. A short scikit-learn sketch follows the syllabus below.

      Course Content/Syllabus of Machine Learning and Data Mining

      • Introduction
        • Machine Learning Basics
        • Data Mining Basics
        • Supervised and Unsupervised Learning
      • Regression Analysis
        • Linear Regression
        • Logistic Regression
      • Regularization
        • Bias
        • Variance
        • Over-fitting
      • Classifiers
        • Decision Trees & Random Forest
        • k-Nearest Neighbor
        • Support Vector Machines
      • Clustering
        • K-Means Clustering
      • Applying Machine Learning
        • Model Selection
        • Train/Test/Validation sets
        • Learning curves
      • Machine Learning Model Design
        • Error Analysis
        • Error Metric
        • Precision and Recall
      • Dimensionality Reduction
        • Principal Component Analysis
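
      As promised above, here is a minimal scikit-learn sketch touching several syllabus items: a train/test split, logistic regression, and precision/recall, using one of scikit-learn's bundled toy datasets.

      # Minimal sketch: train/test split, logistic regression, precision/recall.
      from sklearn.datasets import load_breast_cancer
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import precision_score, recall_score
      from sklearn.model_selection import train_test_split

      X, y = load_breast_cancer(return_X_y=True)
      X_train, X_test, y_train, y_test = train_test_split(
          X, y, test_size=0.3, random_state=0)

      model = LogisticRegression(max_iter=5000)   # higher max_iter helps convergence here
      model.fit(X_train, y_train)

      pred = model.predict(X_test)
      print("precision:", precision_score(y_test, pred))
      print("recall:   ", recall_score(y_test, pred))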

      Object Oriented Software Engineering

      This course teaches advanced software engineering principles and methodologies for effective software development.

      Learning Outcomes:
      On successful completion of this course, the learner will be able to

      • Apply software prototyping, analysis and design techniques
      • Use UML and understand its usage
      • Apply the principles to case studies

      Course Contents:

      • INTRODUCTION: Software engineering paradigms - Software development process models - Project & process - Project management - Process & project metrics - Object-oriented concepts & principles.
      • PLANNING & SCHEDULING: Software prototyping - Software project planning - Scope - Resources - Software estimation - Empirical estimation models - Planning - Risk management - Software project scheduling - Object-oriented estimation & scheduling.
      • ANALYSIS & DESIGN: Analysis modeling - Data modeling - Functional modeling & information flow - Behavioral modeling - Structured analysis - Object-oriented analysis - Domain analysis - Object-oriented analysis process - Object relationship model - Object behaviour model; Design concepts & principles - Design process - Design concepts - Modular design - Designing effective modularity - Introduction to software architecture - Data design - Transform mapping - Transaction mapping - OOD - System design process - Object design process - Design patterns.
      • IMPLEMENTATION & TESTING: Top-down, bottom-up, object-oriented product implementation & integration. Software testing methods - White box, basis path, control structure - Black box - Unit testing - Integration testing - Validation & system testing. Testing OOA & OOD models - Object-oriented testing strategies.
      • MAINTENANCE: Maintenance process - System documentation - Program evolution dynamics - Maintenance costs - Maintainability measurement - Case studies

      Converged Networks

      This course focuses on the design, development, selection, deployment and support of advanced IP-based audio, video and data transmission in a converged network. It aims to provide learners with in-depth knowledge of Multiprotocol Label Switching and IP multicasting. The QoS details of converged networks are covered in depth.

      Learning Outcomes:
      On successful completion of this course, the learner will be able to

      • Implement applications enabled by a multi-service convergent network
      • Explain how real-time traffic is prioritized and carried within a data network.
      • Design Multicast Networks
      • Engineer networks suitable for voice, multicast traffic and high-speed switched Internet based networks

      Course Contents:

      • Teleworker Solutions: Cable and DSL technology, cable system components and benefits, DOCSIS, HFC cable network architecture; DSL variants, DSL performance and distance limitations
      • Real-time Applications in a Converged Network: Review of traditional voice networks; codec/vocoder technologies; VoIP transport; real-time concerns; RTP/RTCP; H.323 and SIP as signaling protocols; cloud-based VoIP & video services
      • QoS for a Converged Network: Review of IP QoS; 802.1p/q; queuing mechanisms - WFQ, CBWFQ, low latency queuing, random early detection; Integrated Services; Resource Reservation Protocol (RSVP); Differentiated Services (DiffServ); QoS issues in WANs; implementation of the DiffServ QoS model
      • Multiprotocol Label Switching: MPLS header; MPLS forwarding basics; quality of service with MPLS TE; MPLS VPN applications; implementation of MPLS and MPLS VPN
      • IP Multicast: Multicast addressing; IGMP, IGMP snooping; multicast routing protocols (PIM-DM, PIM-SM, SDM); configuration

      Data Center Networking

      This course gives students insight into design guidance, configuration examples and best practices for data center networking. It also covers current data center architectures, the new technologies adopted to create modern data center architectures, and the merits and demerits of each. The course examines these new technologies and demonstrates how consolidation can be realized using a unified network approach.

      Learning Outcomes:
      On successful completion of this course, the learner will be able to

      • Critically discuss data centre networking technologies and protocols.
      • Evaluate key concepts in modern Layer 2 & Layer 3 data centre networks.
      • Research a topic related to networking technologies in modern data centers.
      • Design, build and configure complex routed and switched networks.
      • Justify the implementation of networking solutions in a virtualized environment.

      Course Contents:

      • Evolution of Data Centre Design: Design for flexibility, scalability, environmental control, electrical power, flooring, fire protection, security, network infrastructure. Energy use and greenhouse gas emissions. Requirements for modern data centers, high availability and Service Orientated Infrastructures (SOI). Modern data centre use case studies.
      • Data Centre Architectures: Network connectivity optimization evolution: top of rack (TOR), end of rack (EOR), scale up vs scale out, solutions that reduce power and cabling. Data centre standards; TIA/EIA-942. Structured cabling standards, fibre and copper cabling characteristics, cable management, bandwidth requirements, I/O connectivity.
      • Server architectures: stand-alone, blades, stateless, clustering, scaling, optimization, virtualization. Limitation of traditional server deployments; modern solutions. Applications; database, finance etc. Redundant Layer 2 and Layer 3 designs. Case studies.
      • Layer 2 Networks: Ethernet; IEEE 802.3ba; 40 Gbps and 100 Gbps Ethernet. IEEE 802.1D Spanning Tree Protocol (STP), RSTP, PVST, MSTP. TRILL (Transparent Interconnection of Lots of Links), RBridges, IEEE 802.1Qbg Edge Virtual Bridging, 802.1Qbh Bridge Port Extension. Fibre Channel over Ethernet (FCoE) vs Internet Small Computer System Interface (iSCSI). Data Center Bridging (DCB); priority-based flow control, congestion notification, enhanced transmission selection, Data Center Bridging Exchange (DCBX). Layer 2 multicasting; case studies.
      • Layer 3 & Beyond: Layer 3 Data Centre technologies, network virtualization. Protocols; IPv4, IPv6, MPLS, OSPF, IS-IS, BGP. OTV, VPLS layer 2 extension protocols. Locator Identifier Separation Protocol (LISP). Layer 3 Multicasting. Data centre application services. Data centre networking use case studies and the enabling technologies and protocols in the modern data centre.

      Managing Virtual Environments

      This course deals with the management of complex virtual environments, analysis of key performance factors of virtualized systems, principal issues in troubleshooting virtual environments, and evaluation of a small-scale virtual environment developed in the lab. It will equip students with the in-depth knowledge and techniques used to efficiently optimize and effectively troubleshoot virtual infrastructures.

      Learning Outcomes:
      On successful completion of this course, the learner will be able to

      • Discuss and evaluate the management of complex virtual environments
      • Critically analyze key performance factors in virtualized systems.
      • Identify and formulate judgments for management requirements relating to the configuration and performance of virtual environments.
      • Identify and analyze the principal issues in troubleshooting virtual environments.
      • Perform and critically evaluate performance tests of a small-scale virtual environment developed in the lab.

      Course Contents:

      • Performance Management in a Virtual Environment: Management techniques, methodology and key performance metrics used to identify CPU, memory, network, virtual machine and application performance bottlenecks in a virtualized environment.
      • Configuration and Change Management: Configuration and change management goals and guidelines, tools and technologies in virtualized environments.
      • Secure Virtual Networking: Virtual network security architecture, network segmentation and traffic isolation to secure a virtual network configuration.
      • Protecting the Management Environment: Server authentication, authorization, and accounting, SSL certificates, server hardening. Protecting the host system; security architecture, controlling access to storage, hardening hosts. Hardening virtual machines; Virtual machine security architecture, security parameters. Protecting the host and virtual machine systems using server authentication, authorization, and accounting techniques.
      • Troubleshooting Virtual Environments: Interpreting host, network, storage, cluster and virtual machine log files. Network troubleshooting; traffic sniffing; storage access problems; iSCSI authentication and digests. Virtual machine migration; cluster errors with shares, pools, and limits. Command line interfaces and syntax; interpreting host, network, storage, cluster and virtual machine log files and network traces.

      Enterprise Storage Systems

      This course provides a comprehensive introduction to storage technology in an increasingly complex IT environment. It builds a strong understanding of underlying storage technologies and prepares you to learn advanced concepts and technologies. The course also deals with architectures, features and benefits of Intelligent Storage Systems; networked storage technologies such as FC-SAN, NAS and IP-SAN; long-term archiving solutions, the increasingly critical area of information security and the emerging field of storage virtualization technologies.

      Learning Outcomes:

      On successful completion of this course, the learner will be able to

      • Evaluate various storage classifications and technologies.
      • Analyze storage architectures, processes, components and how they relate to virtualization.
      • Justify the implementation of a range of storage solutions to enable business continuity.
      • Analyze storage security design, implementation, monitoring and management.

      Course Contents:

      • Storage Systems: Data classification, storage evolution and data center infrastructure. Host components, connectivity, storage, and protocols. Components of a disk drive, physical disk and factors affecting disk drive performance. RAID level performance and availability considerations. Components and benefits of an intelligent storage system.
      • Storage Networking Technologies: Direct-Attached Storage (DAS) architecture, Storage Area Network (SAN) attributes, components, topologies, connectivity options and zoning. FC protocol stack, addressing, flow control, and classes of service. Network-Attached Storage (NAS) components, protocols, IP Storage Area Network (IP SAN) iSCSI, FCIP and FCoE architecture. Content Addressed Storage (CAS) elements, storage, and retrieval processes.
      • Virtualization: Block-level and file-level storage virtualization technology, virtual provisioning and cloud computing.
      • Business Continuity: Business continuity measurement, terminologies, and planning. Backup designs, architecture, topologies, and technologies in SAN and NAS environments. Local and remote replication using host- and array-based replication technologies such as synchronous and asynchronous methods.
      • Storage Security and Management: Storage security framework and various security domains. Security implementation in SAN, NAS, and IP-SAN networking. Monitoring and storage management activities and challenges

      Data Center Virtualization

      This course focuses on the challenges in setting up a data center. Resource monitoring using hypervisors and access control to virtual machines will be covered in depth in this course. Setting up of a virtual data center and how to manage them with software interfaces will be discussed in detail.

      Learning Outcomes:

      On successful completion of this course, the learner will be able to

      • Identify various constraints and challenges in setting up a data center
      • Demonstrate Enterprise level virtualization and access control in virtual machines
      • Perform Resource monitoring and execute backup and recovery of virtual machines.

      Course Contents:

      • Data Center Challenges: How server, desktop, network Virtualization and cloud computing reduce data centre footprint, environmental impact and power requirements by driving server consolidation; Evolution of Data Centres: The evolution of computing infrastructures and architectures from stand alone servers to rack optimized blade servers and unified computing systems (UCS).
      • Enterprise-level Virtualization: Provision, monitoring and management of a virtual datacenter and multiple enterprise-level virtual servers and virtual machines through software management interfaces; Networking and Storage in Enterprise Virtualized Environments: Connectivity to storage area and IP networks from within virtualized environments using industry standard protocols.
      • Virtual Machines & Access Control: Virtual machine deployment, modification, management, monitoring and migration methodologies.
      • Resource Monitoring: Physical and virtual machine memory, CPU management and abstraction techniques using a hypervisor. 
      • Virtual Machine Data Protection: Backup and recovery of virtual machines using data recovery techniques; Scalability: Scalability features within Enterprise virtualized environments using advanced management applications that enable clustering, distributed network switches for clustering, network and storage expansion; High Availability : Virtualization high availability and redundancy techniques.

      Cloud Security

      The course on cloud security introduces the basic concepts of security systems and cryptographic protocols, which are widely used in the design of cloud security. Issues related to multi-tenancy, virtualized infrastructure security, and methods to improve virtualization security are also covered in this course.

      Learning Outcomes:

      On successful completion of this course, the learner will be able to

      • Compare modern security concepts as they are applied to cloud computing.
      • Assess the security of virtual systems.
      • Evaluate the security issues related to multi-tenancy.
      • Appraise compliance issues that arise from cloud computing.

      Course Contents:

      • Security Concepts: Confidentiality, privacy, integrity, authentication, non-repudiation, availability, access control, defence in depth, least privilege; how these concepts apply in the cloud, what they mean and their importance in PaaS, IaaS and SaaS (e.g. user authentication in the cloud); Cryptographic Systems: Symmetric cryptography, stream ciphers, block ciphers, modes of operation, public-key cryptography, hashing, digital signatures, public-key infrastructures, key management, X.509 certificates, OpenSSL (a short hashing and message-authentication sketch follows this list).

      • Multi-tenancy Issues: Isolation of users/VMs from each other; how the cloud provider can provide this; Virtualization System Security Issues: e.g. ESX and ESXi security, ESX file system security, storage considerations, backup and recovery.
      • Virtualization System Vulnerabilities: Management console vulnerabilities, management server vulnerabilities, administrative VM vulnerabilities, guest VM vulnerabilities, hypervisor vulnerabilities, hypervisor escape vulnerabilities, configuration issues, malware (botnets etc.).

      • Virtualization System-Specific Attacks: Guest hopping, attacks on the VM (delete the VM, attack on the control of the VM, code or file injection into the virtualized file structure), VM migration attack, hyperjacking.

      • Technologies for Virtualization-Based Security Enhancement: IBM security virtual server protection, virtualization-based sandboxing; Storage Security: HIDPS, log management, Data Loss Prevention. Location of the perimeter.

      • Legal and Compliance Issues: Responsibility, ownership of data, right to penetration test, local law where data is held, examination of modern security standards (e.g. PCI DSS), how standards deal with cloud services and virtualization, compliance for the cloud provider vs. compliance for the customer.
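
      As referenced in the Cryptographic Systems item above, here is a minimal sketch, using only the Python standard library, of two of the primitives named there: cryptographic hashing and a keyed MAC (a building block for authentication); the key and message are invented.

      # Minimal sketch: integrity via hashing, authentication via HMAC.
      import hashlib
      import hmac

      message = b"attach volume vol-123 to vm-7"   # hypothetical cloud API request
      key = b"shared-secret-key"                   # hypothetical symmetric key

      # Integrity: a SHA-256 digest changes completely if the message changes.
      print(hashlib.sha256(message).hexdigest())

      # Authentication: only holders of the key can produce a matching HMAC tag.
      tag = hmac.new(key, message, hashlib.sha256).hexdigest()
      expected = hmac.new(key, message, hashlib.sha256).hexdigest()
      print(hmac.compare_digest(tag, expected))    # True: constant-time comparison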

      Cloud Storage Infrastructures

      This course provides a comprehensive view of storage and networking infrastructures for highly virtualized cloud ready deployments. The course discusses the concepts and features related to Virtualized datacenter and cloud, Information storage security and design, storage network design and cloud optimized storage.

      Learning Outcomes:

      On successful completion of this course, the learner will be able to

      • Critically appraise the opportunities and challenges of information management in complex business environments.
      • Evaluate information storage management design in a cloud environment and how it relates to the business objectives of an organization.
      • Analyze the role technology plays in the design of a storage solution in a cloud architecture.
      • Investigate how a global storage solution can be optimized so that it can be delivered successfully from the cloud.
      • Analyze how best to provide reliable access to information both locally and remotely using storage technologies.

      Course Contents:

      • Virtualized Data Center Architecture
      • Information Storage Security & Design
      • Storage Network Design
      • Cloud Optimized Storage
      • Information Availability Design

      Occasionally used to refer to the economics of cloud computing, the term “Cloudonomics” was coined by Joe Weinman. The strategic advantages of public utility clouds are fundamentally different from those of traditional data center environments and private clouds. For individual enterprises, cloud services provide benefits that broadly fall into three categories: lower overall costs for equivalent services (you pay only for what you use); increased strategic flexibility to meet market opportunities without having to forecast and maintain on-site capacity; and access to the advantages of the cloud provider’s massive capacity: instant scalability, parallel processing capability which reduces task processing time and response latency, system redundancy which improves reliability, and better capability to repel botnet attacks. Further, public cloud vendors can achieve unparalleled efficiencies compared to data centers and private clouds because they are able to scale their capacity to address the aggregated demand of many enterprises, each having different peak demand periods. This allows for much higher server utilization rates, lower unit costs, and easier capacity planning, netting a much higher return on assets than is possible for individual enterprises. Finally, because the locations of the public cloud vendor’s facilities are not tied to the parochial interests of individual clients, vendors are able to locate, scale, and manage their operations to take optimum advantage of reduced energy costs, skilled labor pools, bandwidth, or inexpensive real estate.

      This course addresses various aspects of cloud computing, such as the economics of cloud computing; public vs. hybrid vs. private clouds; strategic advantages; how to price cloud services; and business cases. A toy calculation below makes the pay-per-use argument concrete.
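
      The following sketch is purely illustrative, with all numbers invented: fixed on-premises capacity must be sized for peak demand, while a metered cloud deployment pays only for the capacity actually used.

      # Toy calculation: peak-sized fixed capacity vs. metered pay-per-use.
      peak_servers = 100          # capacity needed at peak (invented)
      avg_servers = 25            # average demand across the year (invented)
      hours_per_year = 24 * 365

      on_prem_rate = 0.30         # amortized hardware + power + staff per server-hour (assumed)
      cloud_rate = 0.40           # higher unit price, but metered (assumed)

      on_prem = peak_servers * hours_per_year * on_prem_rate
      cloud = avg_servers * hours_per_year * cloud_rate
      print(f"on-prem: ${on_prem:,.0f}  cloud: ${cloud:,.0f}")
      # Despite the higher unit price, paying for average rather than peak
      # usage wins whenever utilization is low.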

      R for Business Analytics

      During the last decade, the momentum coming from both academia and industry has lifted the R programming language to become the single most important tool for computational statistics, visualization and data science. Worldwide, millions of statisticians and data scientists use R to solve their most challenging problems in fields ranging from computational biology to quantitative marketing. R has become the most popular language for data science and an essential tool for Finance and analytics-driven companies such as Google, Facebook, and LinkedIn.

      Every data analysis technique at your fingertips

      R includes virtually every data manipulation, statistical model, and chart that the modern data scientist could ever need. You can easily find, download and use cutting-edge community-reviewed methods in statistics and predictive modeling from leading researchers in data science, free of charge.

      Create beautiful and unique data visualizations

      Representing complex data with charts and graphs is an essential part of the data analysis process, and R goes far beyond the traditional bar chart and line plot. Heavily influenced by thought leaders in data visualization like Bill Cleveland and Edward Tufte, R makes it easy to draw meaning from multidimensional data with multi-panel charts, 3-D surfaces and more. The custom charting capabilities of R are featured in many of the stunning infographics seen in the New York Times, The Economist, and the FlowingData blog.

      Get better results faster

      Instead of using point-and-click menus or inflexible "black-box" procedures, R is a programming language designed expressly for data analysis. Intermediate level R programmers create data analyses faster than users of legacy statistical software, with the flexibility to mix-and-match models for the best results. And R scripts are easily automated, promoting both reproducible research and production deployments.

      Draw on the talents of data scientists worldwide

      As a thriving open-source project, R is supported by a community of more than 2 million users and thousands of developers worldwide. Whether you're using R to optimize portfolios, analyze genomic sequences, or to predict component failure times, experts in every domain have made resources, applications and code available for free, online.

      Course

      This course introduces you to the R statistical processing language, including how to install R on your computer, read data from SPSS and spreadsheets, and use packages for advanced R functions.

      The course continues with examples on how to create charts and plots, check statistical assumptions and the reliability of your data, look for data outliers, and use other data analysis tools. Finally, learn how to get charts and tables out of R and share your results with presentations and web pages.

      Course Content

      • What is R?
      • Installing R
      • Data Types
      • Import/Export Data
      • Operators
      • Built-in Functions
      • Control Structures
      • User-defined Functions
      • Data Management
      • Debugging

      Cloud computing fundamentals

      A different way to deliver computer resources

      A revolution is defined as a change in the way people think and behave that is both dramatic in nature and broad in scope. By that definition, cloud computing is indeed a revolution. Cloud computing is creating a fundamental change in computer architecture, software and tools development, and of course, in the way we store, distribute and consume information. The intent of this article is to aid you in assimilating the reality of the revolution, so you can use it for your own profit and well-being.

      In the last few years, Information Technology (IT) has embarked on a new paradigm — cloud computing. Although cloud computing is only a different way to deliver computer resources, rather than a new technology, it has sparked a revolution in the way organizations provide information and service.

      Originally IT was dominated by mainframe computing. This sturdy configuration eventually gave way to the client-server model. Contemporary IT is increasingly a function of mobile technology, pervasive or ubiquitous computing, and of course, cloud computing. But this revolution, like every revolution, contains components of the past from which it evolved.

      Thus, to put cloud computing in the proper context, keep in mind that the DNA of cloud computing essentially carries forward its predecessor systems. In many ways, this momentous change is a matter of "back to the future" rather than the definitive end of the past. In the brave new world of cloud computing, there is room for innovative collaboration of cloud technology and for the proven utility of predecessor systems, such as the powerful mainframe. This veritable change in how we compute provides immense opportunities for IT personnel to take the reins of change and use them to their individual and institutional advantage.

      What is cloud computing?

      Cloud computing is a comprehensive solution that delivers IT as a service. It is an Internet-based computing solution where shared resources are provided like electricity distributed on the electrical grid. Computers in the cloud are configured to work together and the various applications use the collective computing power as if they are running on a single system.

      The flexibility of cloud computing is a function of the allocation of resources on demand. This facilitates the use of the system's cumulative resources, negating the need to assign specific hardware to a task. Before cloud computing, websites and server-based applications were executed on a specific system. With the advent of cloud computing, resources are used as an aggregated virtual computer. This amalgamated configuration provides an environment where applications execute independently without regard for any particular configuration.

      Why the rush to the cloud?

      There are valid and significant business and IT reasons for the cloud computing paradigm shift. The fundamentals of outsourcing as a solution apply.

      • Reduced cost: Cloud computing can reduce both capital expense (CapEx) and operating expense (OpEx) costs because resources are only acquired when needed and are only paid for when used.
      • Refined usage of personnel: Using cloud computing frees valuable personnel allowing them to focus on delivering value rather than maintaining hardware and software.
      • Robust scalability: Cloud computing allows for immediate scaling, either up or down, at any time without long-term commitment.

      Course Outline

      • What is Cloud Computing?
      • Traditional IT Infrastructure
      • Cloud Infrastructure and Cloud Advantage
      • Examples of Cloud Advantage
      • Cloud Companies
      • Examples of Cloud Services
      • Use Cases
      • Cloud Segments: IaaS, PaaS, SaaS
      • Cloud Deployment Models
      • IT Roles in Cloud
      • Governance: How will industry standards be monitored?
      • Financial Impact
      • Cloud Security

      Machine Learning

      Enterprise Mobile Application Development using IBM Worklight

      The exponential growth of smartphone usage has created a sudden demand for mobile application development skills in the market. Today, a handful of platforms run these smart devices (smartphones and tablets). This platform domain is led by Apple iOS, Google Android and Microsoft Windows Phone, with Apple and Android being the front runners. IBM® Worklight® helps you extend your business to mobile devices. It is designed to provide an open, comprehensive platform to build, test, run and manage native, hybrid and mobile web apps. IBM Worklight can help reduce both app development and maintenance costs, improve time-to-market and enhance mobile app governance and security.

      About the course

      In this course, you learn how to use IBM Worklight V6.1 to develop mobile applications that run on an Android or iOS environment.
      IBM Worklight is part of the IBM MobileFirst strategy. IBM Worklight provides standards-based technologies and tools that can help you efficiently develop, connect, run, and manage applications for smartphones and tablets.
      In this course, you learn about the capabilities of IBM Worklight and how to use them to develop mobile applications by using the IBM Worklight hybrid coding approach. The course begins with overviews of mobile development, IBM Worklight V6.1, and Worklight Studio. You then learn about the essential application programming interfaces (APIs) and tools that provide for the development, back-end integration, security, and management of cross-platform mobile applications. This course covers topics that include IBM Worklight client-side APIs; user interface (UI) frameworks such as jQuery, Dojo, or Sencha Touch; Apache Cordova; integration; authentication techniques; push notification; and deploying and managing applications.
      The hands-on lab exercises throughout the course reinforce lecture content by giving you direct experience in working with IBM Worklight and mobile application development. The exercises cover skills that include installing IBM Worklight Studio, using Apache Cordova to access native device functions, and integrating native pages and web pages. You also gain practice in securing applications and in using the Application Center feature to share applications with your team during development.
      The lab environment for this course uses the Google Android platform.

      Audience:

      This course is designed for web developers who want to create cross-platform dynamic web applications that can run on desktop browsers and on mobile devices, and for application developers who want to create, manage, and deploy mobile applications to Android and iOS mobile environments by using IBM Worklight V6.1. It is also suitable for students of Engineering (CS, IT) and MCA programmes.

      Pre-Requisites:

      Before taking this course, you should have experience in Java or web development with Eclipse, and a good knowledge of the following web technologies:

      • HTML5
      • JavaScript
      • Cascading Style Sheets (CSS) 3

      A basic knowledge of a mobile web UI framework, such as Dojo Mobile, is also helpful.

      Contents:

      This course covers the following topics:
      • Identify a mobile application design type suitable for your application
      • Develop a mobile application to run on an Android or iOS platform by using the IBM Worklight hybrid coding approach
      • Use IBM Worklight client-side APIs for cross-platform portability
      • Use the Apache Cordova framework to access native device functions
      • Use IBM Worklight server-side APIs for back-end integration
      • Include the Dojo Mobile, jQuery Mobile, or Sencha Touch UI frameworks in an application
      • Secure a mobile application by using different IBM Worklight authentication techniques
      • Develop an application that uses push notifications
      • Deploy an application to a production environment
      • Use the IBM Application Center to share applications within an organization

      Please send your Resume to admission@aegis.edu.in 
      Get in Touch with Ritin Joshi and Rupali Dutta at 9022137010

      Application Development and Deployment for Cloud using IBM BlueMix

      Cloud computing - the deployment of network-based applications in a highly flexible, shared IT environment - is becoming a key enabler of better service delivery and greater value in today's business landscape. It offers a number of major advantages over more traditional application deployment models, including the more efficient use of IT and development resources, easier and less costly maintenance, and the ability to deliver consistent services through a variety of channels. Cloud computing also makes it easier for businesses to partner and bring enhanced composite service offerings to market very quickly. But developing cloud-based applications requires new approaches that address the unique requirements of software-as-a-service. IBM Application Development Services for Cloud delivers on the promise of cloud application development by building custom cloud applications from implementation planning to design, development and deployment.

      About the course
      IBM BlueMix is IBM's Platform-as-a-Service (PaaS), built to help developers quickly integrate applications and speed deployment of new cloud services. Built on open standards and leveraging Cloud Foundry, IBM BlueMix has made major advancements since being announced in 2014, including:

      – Cloud integration services to enable a secure connection between an organization's public applications and private data

      – Swift, secure connections, via the cloud, between Internet of Things and machine-to-machine devices to store, query and visualize data

      – Data and analytics-as-a-service to allow the rapid design and scale of applications that turn Big Data into competitive intelligence

      – DevOps services that support the complete developer lifecycle

      IBM BlueMix combines the strength of IBM's middleware software with third-party and open technologies to create an integrated development experience in the cloud. Using an open cloud environment, IBM BlueMix helps both born-on-the-web and enterprise developers build applications with their choice of tools. With the ability to easily combine a variety of services and APIs from across the industry to compose, test and scale custom applications, developers can cut deployment time from months to minutes. This course is designed to teach you how to build applications for deployment to cloud platforms using BlueMix.

      Pre-Requisites
      To benefit from this course, participants should have the following skills or experience:

      – Programming using Java
      – Familiarity with Java EE for Web development (HTML, JSPs, and Servlets)
      – Experience using relational databases
      – The design-implement-test-deploy stages of application development 
      – Object-oriented design and analysis

      Course Contents
      This course covers the following topics:

      Introduction to Bluemix

      • Cloud computing overview
      • Consumption View – IaaS (SoftLayer), PaaS (IBM Bluemix)
      • Bluemix Architecture
      • Bluemix Overview and Dashboard
      • Setup and installations - Eclipse and CF plugins
      • Lab Exercise: Building an Application from a Boilerplate in the Bluemix UI
      • Deploying a Java web app that uses the PostgreSQL service with the IBM Bluemix Eclipse tools
      • Lab Exercise: Building and Deploying the Java version with the IBM Bluemix Eclipse tools

      Development of Apps using Bluemix Services

      • Registering Services in Bluemix
      • Lab Exercise: Deploying a Node.js app that uses the MySQL service with command line tools
      • Lab Exercise: Build a Twitter Influencer Application in Bluemix with the Bluemix Eclipse tools

      Development of Apps using Dev Ops Services on Blue Mix

      • Overview of DevOps
      • Overview of Bluemix DevOps Services
      • Lab Exercise: DevOps
      • Lab Exercise: JEE Cloud Trader benchmark application on Bluemix that uses performance analysis capabilities

      Bluemix Services in Mobility & Big Data

      • Overview of Services in the areas of Mobile App Development & Big Data
      • Lab Exercise: Building an application with Mobile Backend as a Service (MBaaS) on the Bluemix platform
      • Lab Exercise: Building and deploying the Node.js version with the IBM Bluemix Eclipse tools, and deploying the Python version with command line tools (Part 1: importing and deploying the application; Part 2, optional: updating the application)
      • Lab Exercise: Data Management service - build a BI application using the MapReduce service to perform analytics on Big Data sets

      Roadmap
      Participants who wish to learn further about Cloud Computing should take the following courses:

      - Advanced Application Development and Deployment for Cloud using IBM Bluemix
      - Cloud Security

      For Admissions:

      Please send your Resume to admission@aegis.edu.in 
      Get in Touch with Ritin Joshi  at 9022137010

      Solution Advisory Course in Cloud Computing

      Cloud computing changes the way we think about technology. Cloud is a computing model providing web-based software, middleware and computing resources on demand. By deploying technology as a service, users can be given access only to the resources they need for a particular task. This prevents paying for idle computing resources. Cloud computing can also go beyond cost savings by allowing users to access the latest software and infrastructure offerings to foster business innovation. Cloud computing helps enterprises transform business and technology.

      IBM has come up with great and easy ways to understand cloud computing and make it functional as well. Cloud computing can also go beyond cost savings by allowing users to access the latest software and infrastructure offerings to foster business innovation. Think about what that simple statement means. Yes, saving money is good – even important – but even more important is how the end user uses software and infrastructure. Do you know what that really means?

      From the infrastructure side of things, you should be able to offer something that has enough servers, storage and other essential components to get the task at hand done; in this case, the task is setting up a cloud computing structure. Cloud computing is based on several basic ideas. Infrastructure as a Service (IaaS) is the basic cloud; cloud computing starts (or ends) here. IaaS is all about on-demand services for virtual machines, firewalls and networks. Platform as a Service (PaaS) goes a little deeper: you develop a program on the cloud for you and your employees to use, and the service is that someone else hosts your software for you. Software as a Service (SaaS) is where the software that you are using is hosted and run on a virtual machine; there will never be a need for an upgraded computer after that point.

      About the course
      This course is designed to teach participants the solution advisory concepts and terminology of cloud computing in both IaaS & PaaS. This course describes the various service delivery models of a cloud computing architecture, and the ways in which clouds can be deployed as public, private, hybrid, and community clouds. Participants also learn about the security challenges that cloud deployments experience, and how these are addressed. The course also describes IBM cloud computing architecture and offerings using IBM BlueMix Platform.

      Course Contents
      This course covers the following topics:

      Overview of Cloud Computing Architecture

      • Define cloud computing
      • DEMO: Identifying key characteristics of Cloud: On-Demand Service, Broad Network Access, Resource Pooling
      • Consumption View: Private, Public and Hybrid Cloud Architecture
      • DEMO: Cloud Characteristics: Rapid Elasticity, Measured Services
      • Provider View: SaaS, IaaS, PaaS
      • Understanding Cloud Infrastructure Technologies – System, Storage, Security, Networking, Virtualization Techniques

      Cloud Computing : Infrastructure as a Service (IaaS Model)

      • Understanding Services Management
      • IaaS Overview
      • IBM SoftLayer: Architecture & Characteristics
      • Working with IBM SoftLayer
      • Understanding SoftLayer Compute: Bare Metal & Virtual Machines
      • Understanding SoftLayer Storage
      • Understanding SoftLayer Networking
      • Understanding the Commercial Implications of Cloud Technologies: Integration, Reporting, Billing, Metering
      • Software Defined Networking – a Cloud Perspective

      Cloud Computing : Platform as a Service (PaaS Model)

      • PaaS Overview
      • Application Development Paradigm
      • IBM BlueMix: Overview, API Economy
      • IBM and 3rd-Party Services on BlueMix: Overview
      • Creation of a User Account on BlueMix and Accessing BlueMix
      • Hands-on Development of Sample Apps on BlueMix
      • Best Practices of App Development on BlueMix
      • Hands-on Deployment of Sample Apps on BlueMix
      • Understanding BlueMix Services: IoT, Mobility, Security, etc.
      • Client Case Study

      For Admissions:

      Please send your Resume to admission@aegis.edu.in 
      Get in Touch with Ritin Joshi and Rupali Dutta at 9022137010

      Two Year Full Time Programme

      • Commenced in 2006
      • Autonomous Programme - Affiliated to University of Mumbai
      • Approved by AICTE
      • Programme was reaccredited in November 2013 for a period of 3 years by the National Board of Accreditation (NBA), a body of AICTE, Ministry of HRD, Govt. of India
      • Caters to the needs of the IT industry and ever-changing business environment
      • Excellent record of placements
      • Admission through MAH-MCA CET; test conducted by the Directorate of Technical Education (DTE), Govt. of Maharashtra


      Pedagogy:

      1. Hands on practical exercises
      2. Guest lectures by industry experts
      3. Live projects
      4. Case studies, projects, presentations & assignments

      Field Technician: Also called 'Service Technician', the Field Technician provides after-sales support services to customers, typically at their premises.

      Brief Job Description: The individual at work is responsible for attending to customer complaints, installing newly purchased products, troubleshooting system problems and configuring hardware equipment such as servers, storage and other related networking devices.

      Personal Attributes: The job requires the individual to have the ability to build interpersonal relationships, a customer-centric approach and critical thinking. The individual must be willing to travel to client premises in order to attend to calls at different locations.


      Prescriptive Analytics

      Prescriptive analytics is the third and final phase of business analytics (BA), the first two being descriptive and predictive analytics.

      Prescriptive analytics automatically synthesizes big data, multiple disciplines of mathematical and computational sciences, and business rules to make predictions and then suggests decision options to take advantage of those predictions. The first stage of business analytics is descriptive analytics, which still accounts for the majority of all business analytics today. Descriptive analytics answers the questions "What happened?" and "Why did it happen?". It looks at past performance and understands that performance by mining historical data for the reasons behind past success or failure. Most management reporting - such as sales, marketing, operations, and finance - uses this type of post-mortem analysis.

      The next phase is predictive analytics, which answers the question "What will happen?". This is when historical performance data is combined with rules, algorithms, and occasionally external data to determine the probable future outcome of an event or the likelihood of a situation occurring. The final phase is prescriptive analytics, which goes beyond predicting future outcomes by also suggesting actions to benefit from the predictions and showing the implications of each decision option.

      Prescriptive analytics not only anticipates what will happen and when it will happen, but also why it will happen. Further, prescriptive analytics suggests decision options on how to take advantage of a future opportunity or mitigate a future risk and shows the implication of each decision option. Prescriptive analytics can continually take in new data to re-predict and re-prescribe, thus automatically improving prediction accuracy and prescribing better decision options. Prescriptive analytics ingests hybrid data, a combination of structured (numbers, categories) and unstructured data (videos, images, sounds, texts), and business rules to predict what lies ahead and to prescribe how to take advantage of this predicted future without compromising other priorities.

      All three phases of analytics can be performed through professional services or technology or a combination. In order to scale, prescriptive analytics technologies need to be adaptive to take into account the growing volume, velocity, and variety of data that most mission critical processes and their environments may produce.

      Course Coverage:

      • Rule-Based Expert Systems                 
      • Mathematical Optimization
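
      As a hedged illustration of the Mathematical Optimization topic above, here is a minimal sketch of the prescriptive step: choosing the best action subject to constraints via linear programming. The product-mix numbers are invented for illustration, and the scipy package is assumed to be available.

      # Toy prescriptive-analytics example: pick a production plan that
      # maximizes profit subject to capacity limits (linear programming).
      from scipy.optimize import linprog

      profit = [-40, -30]   # profit per unit of products A and B
                            # (negated because linprog minimizes)
      A_ub = [[2, 1],       # machine-hours consumed per unit
              [1, 2]]       # labour-hours consumed per unit
      b_ub = [100, 80]      # available machine-hours and labour-hours

      res = linprog(c=profit, A_ub=A_ub, b_ub=b_ub,
                    bounds=[(0, None), (0, None)])
      print("Units of A and B to produce:", res.x)
      print("Maximum profit:", -res.fun)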

      Introduction to Data Science

      There is much debate among scholars and practitioners about what data science is, and what it isn’t. Does it deal only with big data? What constitutes big data? Is data science really that new? How is it different from statistics and analytics?

      One way to consider data science is as an evolutionary step in interdisciplinary fields like business analysis that incorporate computer science, modeling, statistics, analytics, and mathematics. Data science is the study of the generalizable extraction of knowledge from data, yet the key word is science. It incorporates varying elements and builds on techniques and theories from many fields, including signal processing, mathematics, probability models, machine learning, statistical learning, computer programming, data engineering, pattern recognition and learning, visualization, uncertainty modeling, data warehousing, and high performance computing with the goal of extracting meaning from data and creating data products.

      From government, social networks and ecommerce sites to sensors, smart meters and mobile networks, data is being collected at an unprecedented speed and scale. Data science can put big data to use.


      Data Science is not restricted to only big data, although the fact that data is scaling up makes big data an important aspect of data science.

      Course Coverage:

      • Introduction to Data Science
        • Data in the data science ecosystem
        • Data sets, training and testing data sets
        • Volume, Variety, Velocity and Value
        • Structured, unstructured and text data
        • Metadata modelling / KDM standard
        • MOF, KDD and KDDML
        • Data Mining Group and PMML standard
        • Unified Modelling Language for metadata modelling
        • Text data and classification
        • Global standards - UNSPSC, DBpedia
        • Data science solutions using platforms: IaaS, PaaS and SaaS based approaches
      • Big Data and Big Data technology, tools and platforms
        • Why Big Data?
        • Hadoop Framework
        • MapReduce
        • YARN
        • HDFS
      • Services attached to the Hadoop Framework (HBase, Hive, ZooKeeper, Cassandra, MongoDB)
        • APIs and their integration model
      • Data Science Platform
        • Virtual infrastructure platform and public cloud
        • AWS Elastic MapReduce platform
        • Apache Spark, Spark SQL, Apache Storm
        • Machine learning platform
        • API design and model for the platform
        • Data platform - MapR
      • Data Science Services Platform
        • Data set design and modelling using UML Infrastructure and MOF
        • KDM and KDDML for numeric data
        • Content/text data modelling
        • Text/content extraction
        • XML pipeline for text aggregation and transformation
        • Semantic content and annotation
        • OWL / RDF / RDF Graph standards
        • Ontology, vocabulary, Linked Data Platform
        • UIMA and NLP for text analysis
      • Algorithms and Machine Learning
        • Classification of data - Support Vector Machines
        • Clustering of data - K-means
        • Collaborative filtering & recommendation engines / easyrec engine
        • Business Intelligence platform

      Natural language processing (NLP) is a field of computer science, artificial intelligence, and linguistics concerned with the interactions between computers and human (natural) languages. As such, NLP is related to the area of human–computer interaction. Many challenges in NLP involve natural language understanding, that is, enabling computers to derive meaning from human or natural language input, and others involve natural language generation.

      Significant growth in the volume and variety of data is due to the accumulation of unstructured text data—in fact, up to 80% of all your data is unstructured text data. Companies collect massive amounts of documents, emails, social media, and other text-based information to get to know their customers better, offer customized services, or comply with federal regulations. However, most of this data is unused and untouched.

      Text analytics, through the use of natural language processing (NLP), holds the key to unlocking the business value within these vast data assets. In the era of big data, the right platform enables businesses to fully utilize their data lake and take advantage of the latest parallel text analytics and NLP algorithms. In such an environment, text analytics facilitates the integration of unstructured text data with structured data (e.g., customer transaction records) to derive deeper and more complete depictions of business operations and customers.

      Course Content:

      • Corpus
      • Unstructured Data Cleansing
      • Unstructured Data Analysis
      • Tagging
      • Vocabulary Mapping
      • Hidden Markov Models
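
      As a small, hedged illustration of the Tagging topic above, the sketch below tokenizes a sentence and assigns part-of-speech tags with NLTK (assuming the NLTK package and its tagger models are installed). Note that NLTK's default tagger is an averaged perceptron rather than the Hidden Markov Models covered in the course.

      import nltk

      # One-time model downloads (tokenizer and POS tagger).
      nltk.download("punkt", quiet=True)
      nltk.download("averaged_perceptron_tagger", quiet=True)

      text = "Companies collect massive amounts of documents and emails."
      tokens = nltk.word_tokenize(text)  # raw text -> word tokens
      tagged = nltk.pos_tag(tokens)      # token -> (token, POS tag) pairs
      print(tagged)  # e.g. [('Companies', 'NNS'), ('collect', 'VBP'), ...]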

      What is Hadoop? 

      Apache™ Hadoop® is an open source software project that enables distributed processing of large data sets across clusters of commodity servers. It is designed to scale up from a single server to thousands of machines, with a very high degree of fault tolerance. Rather than relying on high-end hardware, the resiliency of these clusters comes from the software's ability to detect and handle failures at the application layer.

      It is a data storage and processing system that enables data storage, file sharing, data analytics and more. The technology is scalable and enables effective analysis of large volumes of unstructured data, thereby adding value. It also lets users handle more data through enhanced storage capacity and enables data retrieval in case of hardware failure.

      Hadoop as a solution increasingly offers data retrieval and data security features, and these features are improving with time, leading to better alternatives to traditional database management systems (DBMS). Within the Hadoop market, software is the fastest growing segment, ahead of hardware and services.

      Who is Using Hadoop? 

      With the increasing role of social media and internet communication, Hadoop is used by a wide spectrum of companies, ranging from Facebook to Yahoo:

      • Yahoo (one of the biggest users and a contributor of more than 80% of Hadoop's code)
      • Facebook
      • Cloudera
      • Hortonworks
      • IBM
      • Intel
      • MapR
      • Microsoft
      • Netflix
      • Amazon
      • Adobe
      • eBay
      • Hulu
      • Twitter
      • Snapdeal
      • Tata Sky

      Why use Hadoop?

      Hadoop changes the economics and the dynamics of large-scale computing. Its impact can be boiled down to four salient characteristics. Hadoop enables a computing solution that is:

      Scalable

      A cluster can be expanded by adding new servers or resources without having to move, reformat, or change the dependent analytic workflows or applications.

      Cost effective

      Hadoop brings massively parallel computing to commodity servers. The result is a sizeable decrease in the cost per terabyte of storage, which in turn makes it affordable to model all your data.

      Flexible

      Hadoop is schema-less and can absorb any type of data, structured or not, from any number of sources. Data from multiple sources can be joined and aggregated in arbitrary ways, enabling deeper analyses than any one system can provide.

      Fault tolerant

      When you lose a node, the system redirects work to another location of the data and continues processing without missing a beat.

      Hadoop architecture

      Hadoop is composed of four core components—Hadoop Common, Hadoop Distributed File System (HDFS), MapReduce and YARN.

      Hadoop for the Enterprise from IBM

      IBM BigInsights brings the power of Hadoop to the enterprise, enhancing Hadoop by adding administrative, discovery, development, provisioning, security and support capabilities, along with best-in-class analytics. IBM® BigInsights™ for Apache™ Hadoop® is a complete solution for large-scale analytics. Explore Big Data Analytics using IBM InfoSphere BigInsights, taught by IBM experts at Aegis.

      Sample Jobs in Hadoop

      • Hadoop Developer
      • Hadoop Architect
      • Hadoop Tester
      • Hadoop Administrator
      • Data Scientist

      Companies Recruiting

      • IBM
      • Myntra
      • Snapdeal
      • HP
      • EMC
      • Cloudera

      Hadoop Course Overview

      Through lectures, hands-on exercises, case studies, and projects the students will explore the Hadoop ecosystem, learning topics such as:

      • What Hadoop is and the real-world problems it solves
      • MapReduce concepts and how MapReduce works
      • Writing MapReduce programs (see the sketch below)
      • Architecting Hadoop-based applications
      • Hadoop operations
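
      As a hedged sketch of the MapReduce bullets above, here is the classic word count written for Hadoop Streaming in Python (the file names mapper.py and reducer.py are illustrative). It can be dry-run locally with a shell pipeline such as: cat input.txt | python mapper.py | sort | python reducer.py

      # mapper.py - Hadoop Streaming word-count mapper.
      # Hadoop feeds input lines on stdin; emit one "word<TAB>1" per word.
      import sys

      for line in sys.stdin:
          for word in line.split():
              print(word + "\t1")

      # reducer.py - the matching reducer. Hadoop sorts mapper output by
      # key, so all counts for a given word arrive consecutively.
      import sys

      current_word, count = None, 0
      for line in sys.stdin:
          word, _, value = line.rstrip("\n").partition("\t")
          if word == current_word:
              count += int(value)
          else:
              if current_word is not None:
                  print(current_word + "\t" + str(count))
              current_word, count = word, int(value)
      if current_word is not None:
          print(current_word + "\t" + str(count))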

      Prerequisites and Requirements

      • Programming proficiency in Java or Python is required. Prior knowledge of Apache Hadoop is not required.
      • The primary language during lectures will be Java. However, assignments and projects completed in either Java or Python will be accepted.

      Note: For students who do not have a programming background in Java or Python, additional readings or learning videos will be prescribed. The programming prerequisites will need to be completed within the first 2 weeks of the course.

      Course Contents

      1. Introduction to Hadoop: Real-World Hadoop Applications and Use Cases, Hadoop Ecosystem & projects, Types of Hadoop Processing, Hadoop Distributions, Hadoop Installation.
      2. Hadoop MapReduce: Developing a MapReduce Application, MapReduce Internals, MapReduce I/O, Examples illustrating MapReduce Features.
      3. Hadoop Application Architectures: Data Modeling, Data Ingestion, and Data Processing in Hadoop.
      4. Hadoop Projects: Practical Tips for Hadoop Projects.
      5. Hadoop Operations: Planning a Hadoop Cluster, Installation and Configuration, Security, and Monitoring.
      6. Hadoop Case Studies: A series of Group Discussions and Student Project Presentations.

      References

      1. Lecture notes will form the primary reference material for the course.
      2. Specific reading material, papers, and videos will be assigned during the course to augment the lecture notes.
      3. Reference Textbook: Hadoop: The Definitive Guide, 4th Edition, Tom White, O’Reilly Media, Inc.

      Evaluation

      1. Homework Assignments: Students will be assigned 5 homework assignments during the course: 10%
      2. Quizzes: The best 2 out of 3 unannounced quizzes will be counted towards the final grade: 10%
      3. Mid-Term Exam: 20%
      4. Final Exam: 40%
      5. Student Project: The students will individually, or optionally in teams of two, research, design, implement, and present a substantial term project: 20%

        Consult our Big Data Career Advisers +91 704 531 4371 /+91 981 900 8153 on how to add wings to your career

      Pattern Recognition    

      Pattern recognition is nearly synonymous with machine learning. This branch of artificial intelligence focuses on the recognition of patterns and regularities in data. In many cases, these patterns are learned from labeled "training" data (supervised learning), but when no labeled data are available other algorithms can be used to discover previously unknown patterns (unsupervised learning).

      Pattern recognition algorithms generally aim to provide a reasonable answer for all possible inputs and to perform "most likely" matching of the inputs, taking into account their statistical variation. This is opposed to pattern matching algorithms, which look for exact matches in the input with pre-existing patterns. A common example of a pattern-matching algorithm is regular expression matching, which looks for patterns of a given sort in textual data and is included in the search capabilities of many text editors and word processors. In contrast to pattern recognition, pattern matching is generally not considered a type of machine learning, although pattern-matching algorithms (especially with fairly general, carefully tailored patterns) can sometimes succeed in providing similar-quality output to the sort provided by pattern-recognition algorithms.

      Pattern recognition is studied in many fields, including psychology, psychiatry, ethology, cognitive science, traffic flow and computer science.
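
      To make the contrast above concrete, here is a hedged sketch of both ideas: regular-expression pattern matching (exact, rule-based) versus statistical pattern recognition (learned from labelled examples). The data are invented for illustration and scikit-learn is assumed to be available.

      import re
      from sklearn.neighbors import KNeighborsClassifier

      # Pattern matching: find literal date-like strings with a regex.
      text = "Orders shipped on 2014-01-21 and 2014-02-03."
      print(re.findall(r"\d{4}-\d{2}-\d{2}", text))  # exact matches only

      # Pattern recognition: learn a decision rule from labelled points
      # and generalize it to inputs never seen during training.
      X_train = [[1.0], [1.2], [0.9], [5.0], [5.3], [4.8]]
      y_train = ["short", "short", "short", "long", "long", "long"]
      clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
      print(clf.predict([[1.1], [5.1]]))  # -> ['short' 'long']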

      Numerosity Reduction

      Dimensionality Reduction

      Supervised Feature Selection

      Armiet_Demo

      Automation Testing

      PHP is a server-side scripting language designed for web development but also used as a general-purpose programming language. PHP is now installed on more than 244 million websites and 2.1 million web servers.

      Originally created by Rasmus Lerdorf in 1995, the reference implementation of PHP is now produced by The PHP Group. While PHP originally stood for Personal Home Page, it now stands for PHP: Hypertext Preprocessor, a recursive backronym.

      PHP code is interpreted by a web server with a PHP processor module, which generates the resulting web page: PHP commands can be embedded directly into an HTML source document rather than calling an external file to process data.

      It has also evolved to include a command-line interface capability and can be used in standalone graphical applications. PHP is free software released under the PHP License. PHP can be deployed on most web servers and also as a standalone shell on almost every operating system and platform, free of charge.


      A computer network or data network is a telecommunications network that allows computers to exchange data. In computer networks, networked computing devices pass data to each other along data connections. The connections (network links) between nodes are established using either cable media or wireless media. The best-known computer network is the Internet.

      Network computer devices that originate, route and terminate the data are called network nodes. Nodes can include hosts such as servers and personal computers, as well as networking hardware. Two devices are said to be networked when one device is able to exchange information with the other.

      Computer networks support applications such as access to the World Wide Web, shared use of application and storage servers, printers, and fax machines, and use of email and instant messaging applications. Computer networks differ in the physical media used to transmit their signals, the communications protocols to organize network traffic, the network's size, topology and organizational intent.
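
      Since the paragraphs above describe nodes exchanging data over network links, here is a minimal, hedged sketch in Python that uses a connected socket pair within one process to stand in for two networked nodes (socket.socketpair is available on Unix, and on Windows from Python 3.5).

      import socket

      node_a, node_b = socket.socketpair()  # two connected endpoints

      node_a.sendall(b"hello from node A")  # node A transmits data
      print(node_b.recv(1024))              # node B receives it

      node_a.close()
      node_b.close()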

      What is the course?

      Understand Google Search, Ads and Analytics is a 3-month online course that helps you learn digital marketing tools that will enhance search engine optimization, assist you in creating effective Google Ads, and teach you to use Google Analytics to measure traffic.

      What is the Course Structure?

      • Search Engine Optimization (onsite and offsite)
      • Creating effective Google Ads
      • Use Google Analytics to measure traffic

      Overview: Dance with your Child is a course dedicated to the relationship between child and parent during the formative years of a child's growth. Dance is essential to general education. This education, beginning in early childhood and continuing throughout life, benefits the body, mind and spirit. Additionally, the various physical and mental skills learned while dancing shape success in everyday life. Self-motivation, leadership, teamwork, communication and time management are just a few of the benefits of this fun-filled course.

      Learning Objectives

      • To perceive and communicate who we are through dance
      • To engage the whole person in simultaneously moving, thinking and feeling, and thus to enhance your child's physical, mental and emotional development. This holds for parent and child alike.
      • To experience music, movement and emotions. In this, dance is almost unique as an art form, and very special as part of a child and parent's education.

      Components

      • Physical. Increase flexibility, improve circulation, tone the body and develop muscles. Dance also improves posture, balance and coordination.
      • Intellectual. Enrich learning through a variety of perspectives, both traditional and experimental.
      • Aesthetic. Awaken consciousness of beauty, lending new meaning to movement and form.
      • Cultural. Increase your understanding and appreciation of forms, choices and rituals from a broad range of historical, social and cultural perspectives.
      • Emotional. Develop self-confidence and self-esteem in a stimulating environment.
      • Social. Improve sensitivity, understanding, appreciation and consideration for others, both for their similarities and their differences.

      Course on Technology

      Demonstration course

      First Jury Round 2014:

      • Jury Round: 21-23 January 2014
      • Jury round will be co-hosted with 22nd Convergence India: South Asia's largest ICT Expo, to be held at Pragati Maidan in New Delhi, India.

      Software Engineering

      IT, Management and Innovation

      Master of Science (120 credits) with a major in Informatics

      This two-year programme addresses students with either a business or organisational administration background who want to increase their competence in exploiting IT for business, organisational or industrial innovation, or students with a technology background who want to increase their competence in entrepreneurship and innovation.

      The programme features three key areas: entrepreneurial IT; the use of IT in business, organisational and industrial innovation; and the evaluation of IT investments.

      Leaders of today, both for private and public organisations, must have strong analytical and critical thinking skills to thrive in a competitive global environment, skills which are targeted in this programme.

      Further studies: The programme prepares students for doctoral studies.

      Career prospects: The overall purpose is to contribute to educating a new generation of researchers and practitioners who want to understand the value of IT, how to effectively implement and manage innovative IT-based products and services in the global virtual economy, and how to capture the effects of IT.

      After successfully completing the programme, students will have acquired good analytical skills and the ability to think strategically in terms of both business and technology and be qualified to work in international and national companies.

      Facts
      Credits: 120.0 hp
      Level: Second cycle
      Rate of study: Full-time
      Place of study: Jönköping
      Language: English
      Start date: Autumn 2014
      Application code: HJ-MU042

      Requirements: The applicant must hold a minimum of a bachelor's degree (i.e. the equivalent of 180 credits at an accredited university) with at least 60 credits in informatics, business administration, computer science, computer engineering, information engineering, or equivalent.

      Degree: Degree of Master of Science (120 credits) with a major in Informatics
      Tuition fee for the first semester: SEK 58500
      Total tuition fee: SEK 234000

      Course Overview:

      This course offers a clear explanation of the way networks work, from hardware technology up through the most popular network applications, together with a wider view of enterprise networks and solutions. Using the Internet as a vehicle, it introduces the underlying concepts and principles of modern computer networks, with emphasis on protocols, architectures, and implementation issues. Students learn how to install and configure Cisco switches and routers in local and wide-area networks using various protocols, how to provide Level 1 troubleshooting service, and how to improve network performance and security. Additionally, the course provides training in the proper care, maintenance, and use of networking software tools and equipment.

      The course features the following modules:

      — Networking Basics
      — Routers and Routing Basics
      — Switching Basics and Intermediate Routing
      — WAN Technologies

      This course also prepares you for CCNA certification.

      NETWORK SECURITY

      Bachelor of Science Degree
      The Network Security program is designed to prepare students for careers in computer and network security, and to deal with the challenges specific to this area. Areas of emphasis include fundamental security measures necessary in modern business, performing the basic functions of a PC/Network Support Technician, and installation/configuration of hardware and software infrastructure to support a local area network.


      Graduates of this program have a basic understanding of computer hardware, software, programming concepts, and network administration with a primary emphasis on network security.

      Bridge Course

      Software Testing

      Billing as a Service (BaaS)

      In an unconnected world, companies with billing inefficiencies face the prospect of declining revenue and billing leakage. BaaS from Tech Mahindra is a billing platform hosted in the cloud, which provides a gamut of billing services to customers and enterprises across all key verticals. BaaS provides faster time to market and lower operational costs to organizations. This course provides an overview of BaaS.

      Less is More in the Connected World...

      Benefits

      • Reduce delays in invoicing
      • Increase CDR rating accuracy for greater savings
      • Minimize revenue loss by plugging sources of leakage
      • Ensure service assurance with SLA-driven availability and quality of service
      • A Managed Services Model ensuring:
        • A high level of security and administration
        • Availability of highly skilled and experienced staff
        • Increased operational efficiency with scalable operations

      Features

      • Accurate billing and settlements for all sorts of events, data and content, including voice
      • Effective relationship management with partners and content providers
      • An Opex business model (as against a Capex model), making it a low-cost solution for running billing and managed operations
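
      To make the "CDR rating accuracy" point above concrete, here is a hypothetical sketch of rating: pricing each Call Detail Record (CDR) against a tariff. The tariff values and record fields are invented for illustration.

      import math

      # Invented per-unit prices: voice per minute, SMS per message,
      # data per megabyte.
      TARIFF = {"voice": 0.50, "sms": 0.10, "data": 0.02}

      def rate_cdr(record):
          """Return the charge for one CDR; voice is billed per
          started minute, SMS and data per unit."""
          if record["type"] == "voice":
              units = math.ceil(record["seconds"] / 60)  # started minutes
          else:
              units = record["units"]
          return units * TARIFF[record["type"]]

      print(rate_cdr({"type": "voice", "seconds": 95}))  # 2 minutes -> 1.0
      print(rate_cdr({"type": "sms", "units": 3}))       # 3 messages at 0.10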

      Case in Point

      Our Interconnect Billing service helped the largest Indian telecom player:

      • Reduce invoicing timelines and increase CDR rating accuracy, enabling savings of up to $2 million
      • Lower CapEx by 50% by leveraging economies of scale on a shared platform
      • Reduce the cost of operations by up to 50% with an Opex-based Interconnect Billing Services Bureau model

      Business Intelligence (BI) is a set of theories, methodologies, architectures, and technologies that transform raw data into meaningful and useful information for business purposes.

      The main objective of this course is to introduce students to fundamentals of database technology by studying databases from three viewpoints: those of the database user, the database designer, and the database administrator. It teaches the use of a database management system (DBMS) by treating it as a black box, focusing only on its functionality and its interfaces. Topics include: introduction to database systems, relational database systems, database design methodology, SQL and interfaces, database application development, concept of transactions, ODBC, JDBC, database tuning, database Administration, and current topics (distributed databases, data warehouses, data mining).


      CITM 102 Business Information Systems I


      Certificate and Degree Credit
      Business Program Area


      Antirequisite(s): ITM 101 and ITM 277
      Duration: 39 Hours
      Fee: $867.00* (Payment in full is required at time of enrollment.)
      Available through Distance Education


      This course introduces students to the role of information systems and technology strategy in the modern enterprise with a focus on helping users apply technology to achieve and maintain competitive advantage. Basic concepts include the use of systems to support business decision-making, computer hardware and software systems, networks, telecommunications, and e-business basics. Emphasis is on the development of critical thinking and analytical skills through the exploration of real-life business system applications, case studies, and a research project. During the lab component, this course also provides students with a hands-on introduction to basic Microsoft Office computer applications including MS-Access and Excel. Upon completion of this course, students should have the generic computer skills they will need for academic, personal and business success.

      Note: Standard course outlines for Information Technology Management (CITM) courses are available on the ITM website.

      Sample Distance Course Outline

      Fall 2013
      Enrollment for Fall 2013 is currently open. Online enrollment for specific classes is only available until the first day of the class in question. See Enrollment for instructions on enrolling online, by mail, or in person.
      Undergraduate Students: Please enroll using RAMSS for courses applicable to your degree program.
      Distance Delivery Option
      Distance: Internet 07 Sep 2013 - 07 Dec 2013
      Classroom Delivery Option
      Saturday 9:00 A.M. - 12:00 P.M. 07 Sep 2013 - 07 Dec 2013
      Monday 6:30 P.M. - 9:30 P.M. 09 Sep 2013 - 09 Dec 2013
      Wednesday 6:30 P.M. - 9:30 P.M. 11 Sep 2013 - 11 Dec 2013
      Winter 2014
      Enrollment for Winter 2014 is currently open. Online enrollment for specific classes is only available until the first day of the class in question. See Enrollment for instructions on enrolling online, by mail, or in person.
      Undergraduate Students: Please enroll using RAMSS for courses applicable to your degree program.
      Distance Delivery Option
      Distance: Internet 11 Jan 2014 - 12 Apr 2014
      Classroom Delivery Option
      Monday 6:30 P.M. - 9:30 P.M. 13 Jan 2014 - 14 Apr 2014
      Thursday 6:30 P.M. - 9:30 P.M. 16 Jan 2014 - 17 Apr 2014
      Spring/Summer 2014
      This term is not yet open for enrollment via the Shopping Cart. Priority enrollment for this term begins on March 03, 2014. Regular enrollment begins on March 17, 2014.
      Distance Delivery Option
      Distance: Internet* 03 May 2014 - 26 Jul 2014
      Classroom Delivery Option
      Tuesday & Thursday 6:30 P.M. - 9:30 P.M. 06 May 2014 - 17 Jun 2014

      *Please note: The online version of this course uses SAM software that must be installed on a PC; the software is not compatible with Mac computers.


      This is a test PHP course.

      Aegis Graham Bell Awards 2016:

      The Aegis School of Telecom and Data Science has established the Aegis Graham Bell Awards as a tribute to the father of telephony, Alexander Graham Bell. The Award is intended to promote innovation and entrepreneurship in the fields of Telecom, Social, Mobility, Analytics, Cloud and Security, and to provide recognition for outstanding contributions in these fields in India. The Awards are organized with the support of the Cellular Operators Association of India (COAI) and Telecom Centres of Excellence (TCOE) India, a public-private partnership initiative of the DoT with eight Centres at premier academic institutes of the country (six IITs, IISc and IIMA), each supported by a Telecom Service Provider, and knowledge partner Deloitte.

      IPTV training provides a detailed understanding of Internet Protocol Television (IPTV) technology. IPTV is an IP-based technology that many service providers use to deliver TV service. IP technology means that your TV, PC, home phone and wireless devices can all be integrated to interwork with more control and more personalization.

      Fiber Optic Access is generally referred to as FTTx (where x stands for H - Home, B - Building or C - Curb) and is becoming very popular for triple play service, where voice, data and video can all be carried on a single fiber. This course covers the fundamentals of PON networks and the features, architecture and applications of FTTx technology.

      Dense Wavelength Division Multiplexing (DWDM) is a key component of the world’s communications infrastructure. In this course you will learn about the technology, architecture and applications of DWDM, including DWDM system components and the end-to-end optical network design process. You will also learn about optical control, network management and practical deployment issues.

      This two-day course equips participants with basic to advanced concepts of SDH.

      This three-day course equips participants to work with fiber technology, use splicing machines and OTDR equipment in the field, and efficiently handle a simple fiber optic network.

      India is currently the world’s second-largest telecommunications market and has registered strong growth in the past decade and a half. The Indian mobile economy is growing rapidly and will contribute substantially to India’s gross domestic product (GDP), according to a report prepared by the GSM Association (GSMA) in collaboration with the Boston Consulting Group (BCG).

      The liberal and reformist policies of the Government of India, along with strong consumer demand, have been instrumental in the rapid growth of the Indian telecom sector. The government has enabled easy market access to telecom equipment and a fair, proactive regulatory framework that has ensured the availability of telecom services to consumers at affordable prices. The deregulation of foreign direct investment (FDI) norms has made the sector one of the fastest growing in the country and a top-five generator of employment opportunities.

      Operational Support Systems (OSS)

      Software (and occasionally hardware) applications that support the back-office activities which operate a telco’s network and provision and maintain customer services.

      OSS is traditionally used by network planners, service designers, operations, architects, support, and engineering teams in the service provider. Increasingly, product managers and senior staff under the CTO or COO may also use or rely on OSS to some extent.

      Business Support Systems (BSS)

      Software applications that support customer-facing activities. Billing, order management, customer relationship management and call centre automation are all BSS applications.

      BSS may also encompass the customer-facing veneer of OSS applications such as trouble-ticketing and service assurance; these are back-office activities, but they are initiated directly by contact with the customer.

      This basic relationship between OSS and BSS, where OSS is passed service orders and supplies service assurance information to the BSS layer, is often referred to as ‘Orders Down, Faults Up’.
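
      As a toy, hedged sketch of ‘Orders Down, Faults Up’, the Python classes below (all names invented for illustration) show the BSS passing a service order down to the OSS, and the OSS feeding a fault back up for trouble-ticketing.

      class OSS:
          def __init__(self):
              self.faults = []

          def provision(self, order):
              print("OSS: provisioning '" + order + "' on the network")

          def detect_fault(self, fault):
              self.faults.append(fault)  # service-assurance event

      class BSS:
          def __init__(self, oss):
              self.oss = oss

          def take_order(self, order):
              self.oss.provision(order)  # orders flow down

          def open_trouble_tickets(self):
              for fault in self.oss.faults:  # faults flow up
                  print("BSS: ticket opened for '" + fault + "'")

      oss = OSS()
      bss = BSS(oss)
      bss.take_order("broadband 100 Mbps for customer 42")
      oss.detect_fault("line degradation at customer 42")
      bss.open_trouble_tickets()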

      Telecom Orientation

      Importance of Big Data in Telecom Industry

The Indian telecom industry has crossed two decades post privatization of the sector. Since then a lot of innovation, consolidation, and maturation have happened in the industry, and today we have 12 major mobile telecom operators operating in the country. Presently, the total revenue of telecom operators is about ₹ 1.8 trillion, against a burden of ₹ 2.5 trillion of debt and dwindling voice and SMS revenue. This is happening due to severe tariff competition in voice and declining SMS revenue following the advent of new instant messaging applications. Operators are facing disruptive technologies, rapidly changing business rules and an intensified regulatory environment, leading to eroding service margins. In this scenario, the only stabilizing factor for telecom operators is the revenue generated from data provisioning and the value driven from this data. Telecom operators can use advanced analytics on customer and network data to generate a real-time view of customer preferences and network efficiency. This could empower them to make near real-time, fact-based decisions and hence enable a forward-looking, focused, decisive, and action-oriented culture in the company.

      Benefits of Big Data for Telecom Value Chain

Big data analytics brings considerable value to decision making and provides more accurate and actionable insights, which eventually help build competitive advantages and a more efficient cost structure. In comparison to traditional data warehousing technologies, big data offers the following advantages and opportunities for telecom operators:
1. Prepare your networks for future demands: Big data helps businesses take advantage of the available information within their networks in order to make them robust, optimized, and scalable. It can help optimize routing and quality of service by analyzing network traffic in real time. Reviewing a network from a smartphone perspective helps reveal areas that require improvement. A review of 3G-capable smartphone users running their devices on 2G could identify ways of making significant improvements based on intelligent analysis. Examination of user behavior can also play an important role in understanding how to better deliver media content, directly impacting customer experience.

2. Understand customer experience: Big data makes it easier to understand your customers in detail, right from network data to social media information, which helps establish customer-centric KPIs for understanding user experience. Insights can be deployed in customer call centers to better answer and solve customer concerns on the phone and to flexibly and profitably modify subscriber calling plans immediately. They can be used to tailor marketing campaigns to individual customers using location-based and social networking technologies.
They can also be used in Network Operations Centers (NOCs) to discover and handle any issues for bigger groups of users. Where relevant, information can be collected about the users’ experience in real time. For example, call center staff could see whether the customer on the phone has experienced problems at a particular location or while using a particular service. The result is improved customer service, higher levels of customer satisfaction, and a decreased churn rate. Big data can also help analyze call data records in real time to identify fraudulent behavior immediately, as the sketch below illustrates.
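As a minimal illustration of such real-time screening (the window, threshold and identifiers below are invented; an operator's production system would be far richer), a stream of call records can be checked against a simple rate threshold:

    # Toy real-time CDR screen: flag a subscriber whose calls in the last
    # 60 seconds exceed a threshold. Values and names are illustrative only.
    from collections import defaultdict, deque

    WINDOW_SEC, MAX_CALLS = 60, 20
    recent = defaultdict(deque)        # subscriber -> recent call timestamps

    def screen(subscriber, timestamp):
        calls = recent[subscriber]
        calls.append(timestamp)
        while calls and timestamp - calls[0] > WINDOW_SEC:
            calls.popleft()            # drop calls outside the window
        if len(calls) > MAX_CALLS:
            print(f"possible fraud: {subscriber}, {len(calls)} calls/{WINDOW_SEC}s")

    for t in range(25):                # simulate 25 calls in 25 seconds
        screen("MSISDN-98XXXXXXXX", t)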

      Challenges Faced by Telecom Operators

Big data has the potential to place communications services providers (CSPs) in a prime position to win the battle for customers and create new revenue streams. It provides them with a wealth of information about their customers’ behaviors, preferences, and movements. Yet many CSPs struggle to derive full value from big data.

At a macro level, big data analytics poses many challenges for CSPs because of the variety, velocity, and complexity of the data.
• Variety: Social media networks, connected devices, government portals, call data records, billing information, etc., produce huge amounts of data. Most of the data coming from different sources is unstructured. Telecom players need to enrich their Call Data Records (CDR) with other information such as location-based services and financial information to standardize the data for business intelligence platforms before any analysis can be done on it.
• Velocity: Every minute, Indians spend ₹ 1.85 million shopping online. Almost 100 hours of video are shared on YouTube every minute. The average time spent by a social media user in India is 2.5 hours a day. All this points to the fact that the speed of data generation is tremendously high, and to gain value from this data, it needs to be processed within an appropriate time frame. This volume of data requires new real-time operational capabilities for various functions, which in turn demands increased data storage for compliance and potential future uses, as well as new tools for mediating, managing, and archiving data within available time frames.

• Complexity: User-generated data is mostly unstructured and complex because of the lack of a standard format for storing data. Legacy network and storage devices do not have any specific format for storing data that is relevant for advanced analytics. The data varies with demographics, geography, lifestyle, etc. Analytics may produce misleading results if the data is not filtered properly. At a micro level, telecom companies face other challenges while adopting big data for advanced analytics.

Big data analytics needs professional data scientists who can understand the technology of data analytics and marry it with the business objectives of the telecom operator. While there is a shortage of skilled analytics professionals, small operators find it unaffordable to hire data scientists. According to analysts, analytics-based companies in India are expected to face a shortage of 200,000 data scientists. The infrastructure for data analytics needs high computational capability and storage space. It also needs the flexibility to analyze different formats of data. Telecom operators tend to ignore these requirements, as they are not part of their core business, and also to avoid more capital expenditure. Transitioning historical data from legacy systems to new systems is a challenge. Data quality is also a roadblock, as different equipment provides data in different formats. Data can be inaccurate, and maintaining appropriate data quality is a mammoth task for every company. Governance and privacy issues are other major challenges, as customers prefer not to share their personal data. Government policies and regulations restrict operators from using the data independently.

        

Certification from TSSC (Telecom Sector Skill Council), a non-profit industry-driven body set up under the aegis of the NSDC (National Skill Development Corporation), at the completion of the program.

      Course Overview:

This is a comprehensive live-video-based interactive program on Next Generation Networks. The course will provide candidates with an in-depth understanding of Next Generation Network (NGN) technologies. It will also focus on Broadband and Next Generation Networks (NGN) from technology, regulation and business aspects.

It will cover fixed broadband Internet access, including DSL, cable and FTTH, as well as regulation and business aspects of broadband networks and services. Further, the course will include NGN architectures and protocols, Quality of Service (QoS) in NGN, security mechanisms, the transition of PSTN and IP networks to NGN, IPv6-based NGN, and business challenges for Next Generation Networks.

It will also cover mobile broadband and NGN, including 3GPP mobile broadband (LTE/LTE-Advanced), IEEE mobile broadband (Mobile WiMAX, WiFi), Fixed-Mobile Convergence (FMC), IMS (IP Multimedia Subsystem) for NGN, mobility and location management, and NGN mobile services, as well as regulation and business aspects.

The course will incorporate broadband and NGN services (Next Generation VoIP, IPTV over NGN, Web Services in NGN, Web 2.0, Cloud Computing, Internet of Things, Web of Things), as well as business and regulation challenges for broadband and NGN services.

Course Pedagogy: 

      • Course Duration: 18 hrs
      • Credit Hours: 3 Credit Unit
• 6 Online Live Interactive Web-Based Video Sessions of 3 hrs each, every Saturday & Sunday

      Take Away! 

• A book, "Everything over IP (EOIP): All you wanted to know about NGN", written by Satya N Gupta
• Hand-signed Certificate by NGN Guru Satya N Gupta

       

        

Certification from TSSC (Telecom Sector Skill Council), a non-profit industry-driven body set up under the aegis of the NSDC (National Skill Development Corporation), at the completion of the program.

      Course Overview:

This course introduces the concept of Managed Services (MS) in the telecom domain. Commencing the journey from the non-MS era, it takes students and attendees through an interesting transformation path in which MS sets in and becomes part and parcel of the telecom industry of today. It discusses the drivers of MS in the industry, the initial concerns of Service Providers, the critical strategy for choosing the right MS partner, and MS project implementation & governance during the life cycle of MS. It also describes the MS contract and its importance for the health of MS.

      The course would take you through:

      - The Concept of Managed Services.
      - Traditional approach to Network Operations.
      - Challenges faced by incumbent Service Providers.
      - Drivers of Managed Services.
- Concerns about outsourcing.
      - How to choose the right MS partner.
      - Importance of MS Project Management.
      - Importance of MS Program Management in the life cycle of MS.
      - Types of MS.
      - Managed Services Contract with SLAs and KPIs.
      - IT domain Managed Services.
- A set of relevant MS case studies by students with faculty guidance... and many more.

      Objectives:

- Impart awareness of various aspects of MS in the Telecom domain.

      Skills & Competence to be acquired:

- Complete end-to-end understanding of MS concepts.
- Exhaustive knowledge of the critical aspects of MS implementations and the governance process for MS.
- Competence to manage an MS deal from either the Service Provider’s or the MS vendor’s side.

      Who Should Attend:

      - Telecom Engineers 

      - MS Professionals 

      - Telecom Managers

      - Consultants and Students

       

        

Certification from TSSC (Telecom Sector Skill Council), a non-profit industry-driven body set up under the aegis of the NSDC (National Skill Development Corporation), at the completion of the program.

      Telecom Network Management: Course Overview

• This course has been carefully designed keeping in view the needs of a telecom professional in the contemporary telecom domain. The contents have been carefully woven to address the perspectives of both technical and business management professionals, so the course is valuable whether a telecom professional of today works in the technical management or the business management arena.
• To bring all readers onto the same platform, the course starts with basic concepts such as the definition of a telecom network and the basic aspects of setting up a non-trivial network.
• It answers a few basic questions, such as why there is a need to manage a telecom network and, if so, how to accomplish that.
• Gradually, it takes the reader to the architecture of a Network Management System in its traditional form. The reader is then walked through a few key operational aspects.
• This gives an appreciation of the deficiencies of the traditional approach to network management.
• The reader will then see how the scientific and technical world stepped in and brought in operational models such as TOM and eTOM.
• The reader also gains an understanding of Operations Support Systems (OSS) and Business Support Systems (BSS).

Course Curriculum Topics:

• Telecom Network – A Definition.

      • Steps to setup a Telecom Network.
      • Need to Manage a Telecom Network.
• How to manage a Telecom Network?
• Simple Network Management Protocol (SNMP) – see the illustrative snippet after this list.
      • Functional Definition of Network Management.
      • Traditional OA&M and Operation & Maintenance Centres (OMCs).
      • Telecommunication Management Network (TMN) Architecture.
      • Network Management System & Network Operation Centre (NOC).
      • Challenges faced by Incumbent Network Operators.
      • Operation Support System (OSS) – An Overview.
      • TOM model from TMForum.
      • eTOM model from TMForum.
• A Few Applications from the OSS repertoire.
      • Business Support System – An Overview. 
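To make the SNMP topic above concrete, here is a minimal sketch of an SNMP GET using the pysnmp Python library (the library choice, the host name and the 'public' community string are illustrative assumptions, not part of the course material):

    # Minimal SNMP GET sketch (pysnmp assumed): read a device's sysDescr.
    from pysnmp.hlapi import (SnmpEngine, CommunityData, UdpTransportTarget,
                              ContextData, ObjectType, ObjectIdentity, getCmd)

    errorIndication, errorStatus, errorIndex, varBinds = next(
        getCmd(SnmpEngine(),
               CommunityData('public', mpModel=1),             # SNMPv2c
               UdpTransportTarget(('demo-device.example.com', 161)),
               ContextData(),
               ObjectType(ObjectIdentity('SNMPv2-MIB', 'sysDescr', 0))))

    if errorIndication:
        print(errorIndication)                                 # e.g. timeout
    else:
        for name, value in varBinds:
            print(f"{name} = {value}")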

        


      Course Overview:

This course teaches the nature and workings of financial markets and their use by corporations, investors and others. Participants will acquire skills in modern valuation techniques, including the pricing of fixed-income securities, equities, foreign exchange and derivatives. They will learn about the principles of finance, including arbitrage, market efficiency, and portfolio theory. In the context of corporate finance, the course will introduce the key principles of selecting real investments, financing them, and managing financial risk. From the point of view of investors, individual as well as institutional, we will consider the principles of portfolio selection and management. Finally, the course will look at how banks and other financial institutions make money by bringing issuers and investors together.
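For a flavour of the valuation techniques covered, consider an illustrative fixed-income example (the figures below are made up): a bond’s price is simply the present value of its coupon payments plus the present value of its face value.

    # Illustrative bond pricing by discounting cash flows; numbers invented.
    def bond_price(face, coupon_rate, yield_rate, years):
        coupon = face * coupon_rate
        pv_coupons = sum(coupon / (1 + yield_rate) ** t
                         for t in range(1, years + 1))
        pv_face = face / (1 + yield_rate) ** years
        return pv_coupons + pv_face

    # A 5-year, 8% annual-coupon bond yielding 6% trades above par:
    print(round(bond_price(1000, 0.08, 0.06, 5), 2))   # -> 1084.25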

      TATA

      Telecom Management

      Course Overview:

This course covers the regulatory framework of the Telecommunications sector to provide an understanding of how the Telecom sector is regulated with the objective of protecting consumer interest, while allowing all other stakeholders to make adequate profits to run business operations. Various statutes that regulate telecom, the regulatory and telecom policy-making agencies, the licensing of Telecommunication services and the roles of regulatory and policy-making agencies such as TRAI (Telecom Regulatory Authority of India), TDSAT (Telecom Disputes Settlement and Appellate Tribunal), DoT (Department of Telecom) and the Telecom Commission are also discussed during the course.

This course also includes a detailed study of the various Licenses in the Telecommunication sector, the tariff regulating mechanism, Quality of Service performance by Licensees, the model Interconnect agreement of Basic and Cellular Mobile Telephone services, the TRAI Act, the Interconnect Agreement and USO (Universal Service Obligation).

Course Outline:
- Statutes that regulate Communication
- Regulatory and Policy-making agencies
- Regulatory and Tariff-setting Functions of TRAI
- Licenses for Telecommunication services
- USO (Universal Service Obligation)
- Telecom Numbering Regulation

      Course Objective:

      To understand Regulatory aspects of Telecom Sector in terms of Licensing framework and its compliance.

Who Should Attend:
- Decision-makers and senior policy-makers
- Managers and regulatory staff in the telecom industry
- Service providers
- New and established operators and suppliers
- Developers & Engineers in telecommunications
- Researchers, Consultants and Students

      What Will You Learn: 
      - Licenses in Telecommunication sector
      - Tariff regulating mechanism
      - Quality of Service performance by Licensees
- Model Interconnect agreement of Basic and Cellular Mobile Telephone services
      - TRAI Act
      - Interconnect Agreement and USO (Universal service obligation)

      Do you Know?

• Do you know the Telecom software industry (TCS, Tech Mahindra, Amdocs, etc.) together generated INR 25,000 Crore in 2013?
• Do you know Reliance Jio will be investing INR 70,000 Crore in its 4G network?
• Do you know telecom companies in Africa have turned into banks?
• Do you know 21% of patents are filed in the Mobility area alone?
• Do you know Microsoft has emerged as a most influential player in the Mobile and Telecom space?
• Do you know that where IT/Software ends, Mobile/Telecom begins?
• Do you know why telecom companies do not aggressively conduct campus interviews?

      Agenda

• Career in the Telecom and Mobile Industry.
• Know the current trends in the industry and the various job opportunities.
• Know the myths and facts about the Telecom and mobile industry.
• Get to know one of the most competitive and challenging industries to work in.

      Customer Care Executive (Repair Centre)

      Customer Care Executive in the Handset industry is also known as Customer Service Representative/Showroom Executive/Customer Relationship Officer/ Customer Service Executive/Repair Centre Executive.

Brief Job Description: Individuals at this job provide customer service by interacting with walk-in customers. They also handle, follow up on, and resolve customers’ queries, requests and complaints in a timely manner.

Personal Attributes: This job requires the individual to have good communication skills with a clear diction; ability to construct simple and rational sentences; ability to comprehend simple English sentences; regional language proficiency; strong customer service focus; pleasant personality; and to be self-motivated and a team player with the ability to work under pressure.

      Customer Care Executive (Relationship Centre)

      Customer Care Executive (Relationship Centre) in the Telecom industry is also known as Customer Service Representative / Customer Care Associate / Showroom Executive / Customer Relationship Officer / Customer Service Executive / Store Executive / Retail Executive.

Brief Job Description: Individuals at this job provide customer service by handling, following up on and resolving walk-in customers’ queries, requests and complaints, and proactively recommend/sell the organization’s products and services.

Personal Attributes: This job requires the individual to have good communication skills with a clear diction; ability to construct simple and rational sentences; ability to comprehend simple English sentences; good problem-solving skills; strong customer service focus; strong selling & listening skills; and the ability to work under pressure.

      Sales Executive (Broadband)

      Sales Executive (Broadband) in the telecom industry is also known as Territory Sales Executive/ Territory Sales Representative/ Field Sales Executive/ Field Sales Representative/ Feet on Street (FOS)/ Business Development Executive.

      Brief Job Description: This role is outsourced to a channel partner such as a Consultancy/DSA. Individual at this job identifies the prospect (potential buyer) and sells broadband/landline services to them.

Personal Attributes: The individual in this role must possess good communication skills; must be self-confident, proactive and customer-centric. The individual must be aware of different selling styles such as door-to-door sales, suspecting and prospecting.

Certified Customer Care Executive (Call Centre)

Customer Care Executive in the Telecom industry is also known as Customer Service Representative / Customer Service Associate / Customer Service Advisor / Customer Relationship Officer / Call Centre Executive.

Brief Job Description: Individuals at this job provide customer service support to an organization by interacting with its customers over the phone. They also handle, follow up on and resolve customers’ queries, requests and complaints in a timely manner.

Personal Attributes: This job requires the individual to have good communication skills with a clear diction; ability to construct simple and sensible sentences; ability to comprehend simple English sentences; good problem-solving skills and the ability to approach problems logically; strong customer service focus; the ability to work under pressure; and active listening skills. The individual should also be willing and comfortable to work in shifts.

      Course Overview: 

• Ice Breaker/Meeting People, Significance of the Help Desk, Telecom Industry Overview, CRM Overview, Knowledge of Intranet Tools & Telephony Applications, The Importance of the Role of the CCE, Learning Principles, Segmentation of Customers, Common Terms in Telecom & Contact Centres, Product Offerings in Telecom, Workplace Ergonomics
• Basic working knowledge of computers, Knowledge of handling calls, Database fetching, CRM understanding, the Customer Service Department in a Telecom company
• Reading Skills, Writing Skills, Comprehension Skills, Oral Communication
• Interaction Analytics, Handsets & generations of technology, Understanding Prepaid & Postpaid, Computer Navigation, Cellular Networks, Customer Centricity, Problem Solving
• Objection Handling During Interactions, Selling Skills
• Product Offerings in Telecom, ACD, VAS, Escalation Process, Honest Communication, Learning Principles, Workplace Ergonomics
• Call Quality Parameters for Inbound, Knowledge of Computers, Database fetching, Knowledge of Cellular Networks
• Elements of an Outbound Call, Important elements during Call Opening
• Decision Making, Navigation Skills, Importance of the Help Desk, Call Quality Parameters, Escalation Process, Customer Retention, Professional Assertiveness
• Telecom Technologies, Service Solutions
• PREPAID life cycle, documentation & services, RANDOM REVISION, Hands-On Listening
• Managing Customer Centricity through Quality Principles, Listening to recorded calls on Product Complaints followed by a debrief and role-play on objection/complaint handling
• Revisiting the CRM, Debriefing, Compliance Parameters, Performance Analysis, Computer Knowledge
• Role-play using a dummy CRM: log in using a unique ID & password, take a dummy call, and wrap up & close the call; Parameters for Performance
• Comprehension Skills, Checkpoint 5 on topics covered so far through Written & Viva, Handsets & generations of technology
• Understanding Prepaid & Postpaid, Company Canvas, Building a Service Attitude, Segmentation of Customers, Recording calls on selling
• The process of selling & the interlinkages between telephone etiquette, listening & selling

      Course Name: Mobile VAS Business
      Course Duration: 10 Sessions, 21 Hours.
      Texts and Other Required Course Material: No external material required.
      Author: All course content will be developed by the instructor.

      Case Studies: No external material required. All in-classroom content including Case Studies will be provided by the instructor.

      Course Overview
      Objectives: To build understanding of the Mobile VAS business.
Who Should Attend: Professionals in the Telecom / Digital Media industries who want to build/enhance their business understanding.
      Prerequisite: Basic understanding of telecom/datacom network.
      Course Structure and Administration:
Course Evaluation: 3 quizzes, each with 10 multiple-choice questions: one at the end of the 4th session, one at the end of the 7th, and a third at the end of the 10th. The cumulative score is taken into account.
      Requirements: None

      Revenue Assurance

      Emerging Telecom Technologies

      Telecom Management

      Final Project

        


Introduction to Broadband

      Orientation Course

      Course Overview:

This course covers the regulatory framework of the Telecommunications sector to provide an understanding of how the Telecom sector is regulated with the objective of protecting consumer interest, while allowing all other stakeholders to make adequate profits to run business operations. Various statutes that regulate telecom, the regulatory and telecom policy-making agencies, the licensing of Telecommunication services and the roles of regulatory and policy-making agencies such as TRAI (Telecom Regulatory Authority of India), TDSAT (Telecom Disputes Settlement and Appellate Tribunal), DoT (Department of Telecom) and the Telecom Commission are also discussed during the course.

This course also includes a detailed study of the various Licenses in the Telecommunication sector, the tariff regulating mechanism, Quality of Service performance by Licensees, the model Interconnect agreement of Basic and Cellular Mobile Telephone services, the TRAI Act, the Interconnect Agreement and USO (Universal Service Obligation).

Course Outline:
- Statutes that regulate Communication
- Regulatory and Policy-making agencies
- Regulatory and Tariff-setting Functions of TRAI
- Licenses for Telecommunication services
- USO (Universal Service Obligation)
- Telecom Numbering Regulation

      Course Objective:

      To understand Regulatory aspects of Telecom Sector in terms of Licensing framework and its compliance.

Who Should Attend:
- Decision-makers and senior policy-makers
- Managers and regulatory staff in the telecom industry
- Service providers
- New and established operators and suppliers
- Developers & Engineers in telecommunications
- Researchers, Consultants and Students

      What Will You Learn:
      - Licenses in Telecommunication sector
      - Tariff regulating mechanism
      - Quality of Service performance by Licensees
- Model Interconnect agreement of Basic and Cellular Mobile Telephone services
      - TRAI Act
      - Interconnect Agreement and USO (Universal service obligation)

      Course Overview:

This course offers a comprehensive introduction to, and guide through, the main types of billing systems and services available in modern telecommunications. The outline encompasses the entire end-to-end billing process, with a major focus on the interfaces with OSS, BSS, network management, finance, marketing, pricing and customer support.
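As a toy illustration of the rating step inside such an end-to-end billing process (the tariff table, rounding rule and names below are invented for illustration):

    # Toy rating step of a billing pipeline: price one call detail record.
    from dataclasses import dataclass

    @dataclass
    class CallDetailRecord:
        subscriber: str
        duration_sec: int
        call_type: str                     # 'local' or 'std' (assumed types)

    TARIFF_PER_MIN = {'local': 1.0, 'std': 2.5}    # rupees/min, invented

    def rate_cdr(cdr):
        minutes = -(-cdr.duration_sec // 60)       # bill per started minute
        return minutes * TARIFF_PER_MIN[cdr.call_type]

    cdr = CallDetailRecord('98XXXXXXXX', 125, 'std')
    print(rate_cdr(cdr))                           # 3 minutes -> 7.5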


      Certified In-Store Promoter

      In-Store Promoter in the telecom industry is also known as In-Shop Promoter / Sales Representative / Retail Sales Representative / Sales Executive.

Brief Job Description: The individual at this job demonstrates and highlights the product FAB (Features, Advantages & Benefits) to walk-in customers, offers them the opportunity to touch and feel the product(s) on display, and responds to queries on products and services.

Personal Attributes: This job requires the individual to possess influencing and persuasion skills; excellent verbal and non-verbal communication skills; and English & regional language proficiency. The individual must be energetic and flexible and should have a pleasing personality.

      Course Overview: 

      • Knowledge of the Company/ Organization & Grooming
      • Technical Knowledge
      • Reading Skills
      • Professional Skills
      • Knowledge of the Company/ Organization
      • Communication Skills
      • Influencing Others
      • Active Listening skills
      • Customer Centricity
      • Technical skills
      • Writing Skills
      • Time Management

      Handset Repair Engineer (Level II)

Brief Job Description: The handset repair engineer is responsible for performing handset repair, covering both hardware and software components, and testing the handset for adequacy post repair.

Personal Attributes: This job requires the individual to be analytical and able to handle high-pressure situations. The individual should have basic written and oral communication skills and should be able to apply practical judgement to successfully perform the assigned responsibilities.

      Course Overview:

      Mobile Phone Introduction

      1. Introduction of Mobile Phone
      2. Measuring Tool
3. Repair Tools & Equipment
      4. ESD Safety
      5. Mobile Components I
      6. Mobile Components II
      7. Disassembling/Assembling

L2 Level Hardware Faults and Resolution

      1. Keyboard Related Faults
      2. Display Related Faults
      3. Audio Related Faults
      4. Network/SIM Related
      5. Vibrator Related Fault
      6. Camera Related Faults
      7. Battery/Charging Related faults
      8. Buttons Related Faults
      9. FM / Bluetooth related faults
10. Handsfree / Memory Card Related Faults
      11. Handset Dead
      12. Touchpad Related

Handset Fault Rectification with Software

      1. Basic Computer Understanding
2. Handset Operating System and applications
3. Software Jigs
4. Software Related Faults
5. Software Practice
      6. Data Backup and Restore

      Spares Management and Inventory

      1. Spares Management
      2. Inventory Management (MSL)

Standard Repair Process

      1. Troubleshooting
      2. Repair Process

      General Skill & Communication Skills

      1. Develop General Skill
      2. Communication Skill
      3. Writing Skill
      4. Reading Skill
      5. Analytical Skill


      KPI Parameters and Reports

      1. KPI Parameters & Reports

      Revision

      1. Hardware
      2. Software

      Certified Tower Technician

Tower Technician in the telecom industry is also known as a Site Engineer/Tower Engineer/Site Technician.

Brief Job Description: The individual in this role is responsible for keeping the site live 24x7, maintaining and repairing level-1 faults/issues at the telecom tower site, performing level-1 preventive and corrective maintenance, and reporting faults to the supervisor in time. The individual also needs to travel inter-state and work during odd hours, when required.

Personal Attributes: This job requires the individual to be technically qualified; self-disciplined; assertive; a team player; action-oriented; and to possess analytical skills & problem-solving ability, effective communication skills and the ability to work under pressure.

      Course Content: 

      • Organization Knowledge
      • Technical Knowledge
      • Comprehension Skills
      • Planning and Execution Skills
      • Relationship Building
      • Organization Knowledge and its process and responsibilities
      • Reading Skills
      • Analytical Skills
      • Planning and Execution
      • Organizational Context and Technical Knowledge
      • Professional Skills
      • Technical Skills
      • Core Skills/ Generic Skills: Reading Skills
      • Oral and Communication Skills

Certified Optical Fibre Splicer

The optical fibre splicer is responsible for ensuring efficient splicing of optical fibre cables, and supports optical fibre installation and fibre testing using an OTDR and a power meter.

The Certified Optical Fibre Splicer course is based on the NOS (National Occupational Standards) and accredited by TSSC. It is designed for the field optical fibre splicer who wants to learn the practical aspects of cable splicing. It contains a wealth of information and technical specifications for fibre optic cables. This course combines theory and practical exercises with hands-on activities.

      Course Assessment and Certification:

      Final certification will be done by TSSC based on rigorous assessment.

      Course Overview:

      • Fiber Optics Basics: Principles of optical transport media and OFC communication
      • Optical fiber characteristics like refraction, polarization, attenuation, dispersion
      • Bands in optical fibre and their usability, loss characteristics
• Signal strength and quality KPIs – design values and margins (a worked loss-budget sketch follows this list)
      • Fiber Optic Cables
• Functionality of optical equipment such as cleavers, mechanical and fusion splicing kits, protection sleeves, fibre strippers and fibre-reinforced plastic (FRP) during splicing and jointing
• Functionality of optical test equipment such as the OTDR and power meter
      • Optimal values of OTDR, Power meter and light meter test results
      • Fiber Optics Safety
      • Fiber Cable Preparation and Mid-Span Access
      • Fusion & Mechanical Splicing Process
• Different types of OFC connectors based on the type of equipment
      • Splice Troubleshooting
      • Hands-on Session Begins
      • Hands-on Session Continues
      • Standard process and need for performing duct integrity tests like air tightness tests and kink free tests
      • OSP Enclosures and Installation
      • Hands-on Troubleshooting
      • Installation & Commissioning of Optical fiber cables (OFC)
      • Standard trenching, cable laying, pit preparation, splicing, jointing, blowing and back-filling process for installation of OFC cables
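As a worked sketch of the loss-budget arithmetic behind the design values and margins mentioned above (all figures are assumed typical values, not course-prescribed ones):

    # Back-of-the-envelope optical link loss budget; numbers are typical
    # assumptions for single-mode fibre, not specifications from the course.
    FIBER_LOSS_DB_PER_KM = 0.35    # @ 1310 nm
    SPLICE_LOSS_DB = 0.1           # per fusion splice
    CONNECTOR_LOSS_DB = 0.5        # per connector pair

    def link_loss(km, splices, connectors):
        return (km * FIBER_LOSS_DB_PER_KM
                + splices * SPLICE_LOSS_DB
                + connectors * CONNECTOR_LOSS_DB)

    tx_power_dbm = -3.0            # assumed transmitter launch power
    rx_sensitivity_dbm = -28.0     # assumed receiver sensitivity

    loss = link_loss(km=40, splices=10, connectors=2)    # 14 + 1 + 1 = 16 dB
    margin = tx_power_dbm - rx_sensitivity_dbm - loss    # 25 - 16 = 9 dB
    print(f"link loss = {loss:.1f} dB, margin = {margin:.1f} dB")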