Emerging Technologies Unveiled: AI, ML, Blockchain & More

  • Artificial Intelligence
    • Definitions and History:
      • What is AI? (Mimicking human intelligence, problem-solving, etc.)
      • Who coined the term “Artificial Intelligence” (John McCarthy)?
      • Turing Test (Purpose).

  • Types of AI:
    • Strong AI vs. Weak AI (Narrow AI): Understanding the fundamental difference (human-level vs. task-specific).
    • Reactive Machines, Limited Memory, Theory of Mind, Self-Aware AI (conceptual differentiation).

  • Core AI Branches/Components:
    • Machine Learning: What it is in relation to AI (subset).
    • Natural Language Processing (NLP): Its goal (understanding/generating human language).
    • Computer Vision (CV): Its goal (interpreting visual data).
    • Robotics (basic idea).

  • Problem Solving & Search:
    • Uninformed Search (BFS, DFS) vs. Informed Search (A* with a heuristic): basic understanding of when to use which (see the sketch after this list).
    • Heuristics.
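
A minimal sketch of the uninformed vs. informed distinction, using a hypothetical toy graph and heuristic table (the node names, costs, and heuristic values below are illustrative, not from the source): BFS ignores edge costs and returns the path with the fewest edges, while A* orders its frontier by path cost plus a heuristic estimate and returns the cheapest path.

```python
from collections import deque
import heapq

# Hypothetical toy graph: node -> list of (neighbor, edge_cost).
GRAPH = {
    "A": [("B", 1), ("C", 4)],
    "B": [("D", 5), ("E", 1)],
    "C": [("E", 1)],
    "D": [],
    "E": [("D", 2)],
}
# Hypothetical heuristic: an optimistic estimate of remaining cost to goal "D".
H = {"A": 3, "B": 2, "C": 2, "D": 0, "E": 1}

def bfs(start, goal):
    """Uninformed search: explores level by level and ignores edge costs."""
    frontier = deque([[start]])
    visited = {start}
    while frontier:
        path = frontier.popleft()
        if path[-1] == goal:
            return path
        for neighbor, _cost in GRAPH[path[-1]]:
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(path + [neighbor])
    return None

def a_star(start, goal):
    """Informed search: orders the frontier by f(n) = g(n) + h(n)."""
    frontier = [(H[start], 0, [start])]   # (f, cost so far, path)
    best_g = {start: 0}
    while frontier:
        _f, g, path = heapq.heappop(frontier)
        if path[-1] == goal:
            return path, g
        for neighbor, cost in GRAPH[path[-1]]:
            new_g = g + cost
            if new_g < best_g.get(neighbor, float("inf")):
                best_g[neighbor] = new_g
                heapq.heappush(frontier, (new_g + H[neighbor], new_g, path + [neighbor]))
    return None, float("inf")

print(bfs("A", "D"))     # fewest edges: ['A', 'B', 'D']
print(a_star("A", "D"))  # cheapest cost: (['A', 'B', 'E', 'D'], 4)
```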

  • AI Agents:
    • Basic components of an intelligent agent (Percepts, Actions, Environment); a minimal agent loop is sketched below.
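
A minimal sketch of the percept-to-action loop, using a hypothetical thermostat agent (the temperature thresholds and action names are illustrative assumptions):

```python
# A simple-reflex agent: it maps each percept directly to an action.

def thermostat_agent(percept: float) -> str:
    """Agent function: percept (room temperature in °C) -> action."""
    if percept < 19.0:
        return "heat_on"
    if percept > 23.0:
        return "heat_off"
    return "do_nothing"

def run(environment_temps):
    """Environment loop: feed the agent a stream of percepts, collect its actions."""
    for percept in environment_temps:
        action = thermostat_agent(percept)
        print(f"percept={percept:>5.1f}°C -> action={action}")

run([17.5, 18.9, 21.0, 24.2])
```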

  • Ethical Considerations (Conceptual):
    • Bias, fairness, privacy (awareness of these issues).

  • Common AI Languages (Historical/Prominent): LISP, Prolog, Python.

  • Natural Language Processing
    • Fundamental Concepts (see the sketch after this list):
      • Tokenization: Splitting text into words/units.
      • Stemming vs. Lemmatization: Differences and purpose (text normalization).
      • Stop Words: What they are and why removed.
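
A minimal, dependency-free sketch of these three preprocessing steps; the regex tokenizer, tiny stop-word list, naive suffix-stripping stemmer, and lemma lookup table are illustrative stand-ins for real tools such as NLTK's tokenizers, PorterStemmer, and WordNetLemmatizer.

```python
import re

text = "The cats were running quickly through the gardens."

# 1. Tokenization: split raw text into word-level units (here via a simple regex).
tokens = re.findall(r"[a-z]+", text.lower())

# 2. Stop-word removal: drop very common words that carry little meaning.
#    (A tiny illustrative stop list; real toolkits ship full lists.)
stop_words = {"the", "a", "an", "were", "was", "is", "through", "of", "and"}
content = [t for t in tokens if t not in stop_words]

# 3a. Stemming: crude suffix chopping; the result need not be a real word.
def naive_stem(word):
    for suffix in ("ing", "ly", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# 3b. Lemmatization: map each word to its dictionary form; a real lemmatizer
#     uses a lexicon and part-of-speech information (this table is a toy stand-in).
lemma_table = {"cats": "cat", "running": "run", "gardens": "garden"}
def naive_lemmatize(word):
    return lemma_table.get(word, word)

print("tokens:", tokens)
print("stems: ", [naive_stem(t) for t in content])       # ['cat', 'runn', 'quick', 'garden']
print("lemmas:", [naive_lemmatize(t) for t in content])  # ['cat', 'run', 'quickly', 'garden']
```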

  • Text Representation (see the sketch after this list):
    • Bag-of-Words (BoW): Basic concept.
    • TF-IDF: Term Frequency-Inverse Document Frequency (purpose).
    • Word Embeddings: Concept of representing words as vectors that capture semantic similarity (e.g., Word2Vec, GloVe; BERT for contextual embeddings – focus on their function).
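
A minimal sketch of Bag-of-Words and TF-IDF, assuming scikit-learn is available; the three-document corpus is a made-up toy example.

```python
# Assumes scikit-learn is installed (pip install scikit-learn).
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]

# Bag-of-Words: each document becomes a vector of raw word counts,
# ignoring grammar and word order.
bow = CountVectorizer()
print(bow.fit_transform(corpus).toarray())
print(bow.get_feature_names_out())

# TF-IDF: counts are re-weighted so that terms appearing in many documents
# get a lower weight than terms concentrated in a few documents.
tfidf = TfidfVectorizer()
print(tfidf.fit_transform(corpus).toarray().round(2))
```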

  • Core NLP Tasks (see the sketch after this list):
    • Part-of-Speech (POS) Tagging: Assigning grammatical tags.
    • Named Entity Recognition (NER): Identifying specific entities (persons, locations, organizations).
    • Sentiment Analysis: Determining emotional tone.
    • Machine Translation: Converting language.
    • Text Summarization: Condensing text.
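
A minimal sketch of the first two tasks (POS tagging and NER), assuming spaCy and its small English model (en_core_web_sm) are installed; the example sentence is illustrative.

```python
# Assumes spaCy and its small English model are installed:
#   pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Barack Obama was born in Hawaii and later worked in Washington.")

# Part-of-Speech tagging: one grammatical tag per token.
print([(token.text, token.pos_) for token in doc])

# Named Entity Recognition: labelled spans such as persons and places
# (exact labels, e.g. PERSON or GPE, depend on the model).
print([(ent.text, ent.label_) for ent in doc.ents])
```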

  • Architectures (Conceptual):
    • RNNs/LSTMs: Used for sequential data.
    • Transformers & Attention: Revolutionary for NLP (understanding the concept of attention); a NumPy sketch of scaled dot-product attention follows this list.
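
A minimal NumPy sketch of scaled dot-product attention, the core operation behind Transformers; the 4-token, 8-dimensional random Q, K, V matrices are illustrative (in a real model they come from learned projections of the token embeddings).

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # similarity of each query to each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax: each row sums to 1
    return weights @ V, weights                           # weighted mix of the value vectors

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # hypothetical queries for 4 tokens
K = rng.normal(size=(4, 8))   # hypothetical keys
V = rng.normal(size=(4, 8))   # hypothetical values

output, weights = scaled_dot_product_attention(Q, K, V)
print(weights.round(2))   # each row: how strongly one token attends to every other token
print(output.shape)       # (4, 8): one context-aware vector per token
```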

  • Ambiguity in NLP: Lexical, syntactic, semantic ambiguity (basic awareness).

  • Machine Learning
    • Core ML Paradigms:
      • Supervised Learning: Definition, examples (Regression, Classification).
      • Unsupervised Learning: Definition, examples (Clustering, Dimensionality Reduction).
      • Reinforcement Learning: Definition (agent-environment interaction, rewards).

  • Key Terminology (see the sketch after this list):
    • Features/Attributes: Input variables.
    • Labels/Targets: Output variables.
    • Training Data, Test Data, Validation Data: Their roles.
    • Overfitting & Underfitting: How to identify, basic mitigation.
    • Bias-Variance Trade-off: Conceptual understanding.
    • Hyperparameters: Tunable parameters.
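
A minimal sketch of the train/validation/test split and of spotting overfitting, assuming scikit-learn; the Iris dataset and the max_depth values are just convenient illustrative choices.

```python
# Assumes scikit-learn is installed; the Iris dataset is a convenient toy.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

# Hold out a test set, then carve a validation set out of the remainder.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.25, random_state=0)

# max_depth is a hyperparameter: tune it on the validation set, never on the test set.
# A large gap between train and validation accuracy signals overfitting;
# low accuracy on both signals underfitting.
for max_depth in (1, 3, None):
    model = DecisionTreeClassifier(max_depth=max_depth, random_state=0).fit(X_train, y_train)
    print(f"max_depth={max_depth}: "
          f"train acc={model.score(X_train, y_train):.2f}, "
          f"val acc={model.score(X_val, y_val):.2f}")

# The test set is touched once, at the very end, to estimate real-world performance.
print("final test acc:", round(model.score(X_test, y_test), 2))
```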

  • Common Algorithms (Focus on how they work at a high level and their primary use case; a short sketch follows this list):
    • Regression: Linear Regression (basic equation).
    • Classification:
      • Logistic Regression (binary classification).
      • Decision Trees (tree structure, simple rules).
      • K-Nearest Neighbors (KNN) (instance-based, lazy learner).
      • Support Vector Machines (SVM) (finding optimal hyperplane).
      • Naive Bayes (probabilistic, based on Bayes’ theorem).
    • Clustering: K-Means (grouping data into ‘k’ clusters).
    • Dimensionality Reduction: Principal Component Analysis (PCA) (reducing features).
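
A minimal sketch of two of the listed algorithms (Linear Regression and K-Means), assuming scikit-learn and NumPy; the synthetic data, generated from y = 3x + 2 and from two Gaussian blobs, is purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Linear Regression: fit y ≈ w*x + b to noisy points generated from y = 3x + 2.
X = rng.uniform(0, 10, size=(50, 1))
y = 3 * X[:, 0] + 2 + rng.normal(scale=0.5, size=50)
reg = LinearRegression().fit(X, y)
print("w ≈", round(float(reg.coef_[0]), 2), "b ≈", round(float(reg.intercept_), 2))  # near 3 and 2

# K-Means: group unlabelled points into k=2 clusters around two synthetic centres.
points = np.vstack([rng.normal(loc=0.0, size=(30, 2)), rng.normal(loc=5.0, size=(30, 2))])
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(km.cluster_centers_.round(1))  # roughly (0, 0) and (5, 5)
```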

  • Model Evaluation Metrics (Crucial for MCQs; a worked example follows this list):
    • Classification: Accuracy, Precision, Recall, F1-Score (definitions and what they measure), Confusion Matrix (components).
    • Regression: Mean Squared Error (MSE), Root Mean Squared Error (RMSE).
    • Cross-Validation: Purpose (robust evaluation).
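
A worked example of the classification metrics computed by hand from a small confusion matrix, plus MSE/RMSE on toy numbers; all values are made up for illustration.

```python
import math

# Classification: count the confusion-matrix cells, then derive the metrics.
y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 0, 0, 0]

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # 3 true positives
fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # 1 false positive
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # 1 false negative
tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))  # 5 true negatives

accuracy = (tp + tn) / len(y_true)                   # 0.8: fraction of all predictions that are correct
precision = tp / (tp + fp)                           # 0.75: of predicted positives, how many are real
recall = tp / (tp + fn)                              # 0.75: of real positives, how many were found
f1 = 2 * precision * recall / (precision + recall)   # 0.75: harmonic mean of precision and recall
print(tp, fp, fn, tn, accuracy, precision, recall, f1)

# Regression: MSE averages squared errors; RMSE brings it back to the target's units.
y_true_r = [3.0, 5.0, 2.5]
y_pred_r = [2.5, 5.0, 4.0]
mse = sum((t - p) ** 2 for t, p in zip(y_true_r, y_pred_r)) / len(y_true_r)  # ≈ 0.833
rmse = math.sqrt(mse)                                                        # ≈ 0.913
print(mse, rmse)
```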

  • Deep Learning Basics (as part of ML; see the sketch after this list):
    • Neural Networks: Basic structure (layers of neurons), Activation Functions (purpose: introduce non-linearity).
    • Backpropagation (conceptual): How weights are updated.
    • CNNs: For images.
    • RNNs: For sequences.
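
A minimal sketch of a single sigmoid neuron trained by gradient descent, which shows the same chain-rule weight update that backpropagation applies layer by layer in a full network; the AND-gate task, learning rate, and epoch count are illustrative choices.

```python
import numpy as np

def sigmoid(z):
    """Activation function: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Toy task: learn the logical AND of two inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1], dtype=float)

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights
b = 0.0                  # bias
lr = 0.5                 # learning rate (a hyperparameter)

for epoch in range(5000):
    # Forward pass: prediction for every example.
    y_hat = sigmoid(X @ w + b)
    # Backward pass: gradient of the cross-entropy loss w.r.t. w and b.
    error = y_hat - y              # dL/dz for a sigmoid output with cross-entropy loss
    grad_w = X.T @ error / len(X)
    grad_b = error.mean()
    # Update step: move the weights against the gradient.
    w -= lr * grad_w
    b -= lr * grad_b

print(np.round(sigmoid(X @ w + b), 2))  # predictions should approach [0, 0, 0, 1]
```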

  • Blockchain Technology
    • Fundamental Principles:
      • Decentralization: No central authority.
      • Immutability: Data cannot be altered after recording.
      • Transparency: Transactions are visible to participants.
      • Cryptographic Hashing: Securing blocks, creating links.
    • Distributed Ledger Technology (DLT): The underlying concept.
    • Core Components (see the sketch after this list):
      • Blocks: Data, timestamp, hash of previous block.
      • Chain: How blocks are linked.
      • Nodes: Participants in the network.
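
A minimal sketch of the block/chain structure using Python's hashlib: each block stores data, a timestamp, and the hash of the previous block, so tampering with an old block breaks every link that follows it. The transactions are made-up examples.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Cryptographic hash (SHA-256) of a block's contents."""
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def new_block(data, previous_hash):
    return {"data": data, "timestamp": time.time(), "previous_hash": previous_hash}

# Build a tiny chain: a genesis block first, each later block linking to the one before.
chain = [new_block("genesis", previous_hash="0" * 64)]
for payload in ("Alice pays Bob 5", "Bob pays Carol 2"):
    chain.append(new_block(payload, previous_hash=block_hash(chain[-1])))

def is_valid(chain):
    """Every stored previous_hash must match the recomputed hash of the block before it."""
    return all(chain[i]["previous_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(is_valid(chain))                    # True
chain[1]["data"] = "Alice pays Bob 500"   # tamper with an old block...
print(is_valid(chain))                    # False: the link to the next block no longer matches
```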

  • Consensus Mechanisms (Key for MCQs):
    • Proof of Work (PoW): How it works (mining, solving a computational puzzle), key characteristics (high energy consumption); a toy mining loop is sketched below.
    • Proof of Stake (PoS): How it works (staking), advantages over PoW.
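
A toy Proof-of-Work sketch: keep trying nonces until the block's hash has a chosen number of leading zeros, which is cheap to verify but costly to produce. The difficulty of 4 hex zeros and the sample transaction are illustrative.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4):
    """Return (nonce, hash) such that the hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}|{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("Alice pays Bob 5", difficulty=4)
print(nonce, digest)  # anyone can verify with a single hash; each extra zero multiplies the work
```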

  • Smart Contracts:
    • Definition: Self-executing agreements.
    • Decentralized Applications (DApps): Applications built on smart contracts.
    • Ethereum as the best-known platform for smart contracts.

  • Types of Blockchains:
    • Public Blockchain: (e.g., Bitcoin, Ethereum) – open, permissionless.
    • Private/Permissioned Blockchain: (e.g., Hyperledger Fabric) – restricted access.
    • Consortium Blockchain (basic idea).

  • Key Use Cases:
    • Cryptocurrencies (Bitcoin, Ethereum).
    • Supply Chain.
    • Digital Identity.
    • Decentralized Finance (DeFi).
    • Non-Fungible Tokens (NFTs).

  • Challenges: Scalability, 51% attack.