Module-4.pptx




Slide Content

INTRODUCTION TO AI AND APPLICATIONS (BAIA103/203)
Dr. Abhishek S. Rao, Associate Professor, Dept. of IS&E, Canara Engineering College, Mangaluru.

8.1 AI and Ethical Concerns

What's in the syllabus:
- Ethical Use of AI → ensuring fairness, transparency, and accountability in AI systems.
- Is AI Dangerous? Will Robots Take Over? → myths vs. reality; AI as a tool vs. fear of superintelligence.
- Ethics in AI → principles: right, fair, just outcomes (Microsoft Tay bot failure).
- AI and Bias → biased training data → biased results → unfair outcomes (see the sketch after this slide).
- Towards Ethical & Trustworthy AI → unbiased datasets, explainable AI, human oversight.
- Why is Ethical AI Important? → prevents harm, builds public trust, supports long-term adoption.
- Impact of AI on Jobs → automation replaces repetitive jobs but creates new opportunities (AI maintenance, data roles).

How can you explain it in class:
- Start with a real case: the Microsoft Tay chatbot (2016) became racist and abusive in under 24 hours.
- Show the contrast: AI in healthcare saving lives vs. AI in recruitment showing bias.
- Ask students: "Would you trust an AI system to decide who gets a loan or a job?" → sparks a discussion on fairness.
- Use a simple analogy: AI is like a mirror → it reflects the data it is trained on, good or bad.
- End with a debate: "Will AI replace jobs or create better ones?"
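To make the "AI is a mirror" analogy concrete, here is a minimal, hypothetical sketch using scikit-learn with invented toy data: a classifier trained on skewed historical hiring records simply reproduces that skew in its predictions.

```python
# Minimal sketch (toy, invented data): a model trained on biased hiring
# records learns to prefer the group that was historically favoured.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, n)   # 0 = group A, 1 = group B
skill = rng.normal(0, 1, n)     # skill is distributed identically in both groups

# Historical labels: group A was hired far more often at the same skill level.
hired = ((skill + 1.5 * (group == 0)) > 0.5).astype(int)

X = np.column_stack([group, skill])
model = LogisticRegression().fit(X, hired)

# Same skill, different group -> very different predicted hiring probability.
print(model.predict_proba([[0, 0.5]])[0, 1])   # group A candidate
print(model.predict_proba([[1, 0.5]])[0, 1])   # group B candidate
```

The model is never told to discriminate; it only learns the pattern present in the data, which is exactly the point of the mirror analogy on the slide.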

Microsoft Tay (2016) – Case Study

What it was:
- Tay ("Thinking About You") was a Twitter-based chatbot launched by Microsoft in March 2016.
- It was designed to mimic the language patterns of a teenage girl and to learn from conversations with users.

What happened:
- Within 16 hours of launch, Tay began tweeting highly offensive, racist, and inappropriate content.
- This happened because trolls deliberately flooded it with toxic language and Tay "learned" to mimic them without safeguards.

Why it failed:
- Lack of content filtering: no strong moderation against offensive inputs.
- Reinforcement from bad data: it learned from malicious users.
- Unclear guardrails: no ethical boundaries or role constraints were set.

Aftermath:
- Microsoft took Tay offline within a day and later issued an apology.
- The case became a classic lesson in AI ethics, safety, and bias control.

8.2 AI as a Service (AIaaS)

What's in the syllabus:
- Factors Triggering Growth of AIaaS → cloud computing, big data, cheaper hardware, demand for AI without high setup costs.
- The Growth of AIaaS → rapid adoption by businesses (healthcare, finance, retail); flexibility of pay-as-you-use models.
- Challenges of AIaaS → data privacy, vendor lock-in, hidden costs, limited customization.
- Vendors of AIaaS → Amazon AWS AI, Microsoft Azure AI, Google Cloud AI, IBM Watson, Oracle AI.

How can you explain it in class:
- Start with an analogy: "Just as we use Netflix as a service instead of building our own cinema, companies use AIaaS instead of building AI from scratch."
- Show examples: AWS Rekognition (face detection), Google AutoML (no-code ML), IBM Watson (healthcare); see the sketch after this slide.
- Quick activity: ask students, "If you were starting a startup, would you build your own AI or use AIaaS? Why?"
- Highlight risks: dependency on a vendor plus security concerns (a real-world challenge).
- Wrap up with a discussion: "AIaaS makes AI accessible to everyone, but do we lose control?"
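As a concrete AIaaS illustration, here is a minimal, hedged sketch of calling AWS Rekognition's face-detection API through boto3. The region, bucket, and image names are hypothetical placeholders, and the call assumes AWS credentials are already configured; it is one possible way to consume the service, not the only one.

```python
# Minimal AIaaS sketch: face detection with AWS Rekognition via boto3.
# Assumes configured AWS credentials; the S3 bucket and image name below
# are hypothetical placeholders for a real deployment.
import boto3

rekognition = boto3.client("rekognition", region_name="us-east-1")

response = rekognition.detect_faces(
    Image={"S3Object": {"Bucket": "my-demo-bucket", "Name": "class-photo.jpg"}},
    Attributes=["ALL"],  # also return age range, emotions, etc.
)

for face in response["FaceDetails"]:
    box = face["BoundingBox"]
    print(f"Face at {box}, confidence {face['Confidence']:.1f}%")
```

The point for class: no model training and no GPUs on the user's side, just an API call billed per image, which is the pay-as-you-use model mentioned on the slide.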

8.4 Recent Trends in AI

What's in the syllabus:
- Collaborative Systems → AI systems that work with humans (co-bots, decision-support systems).
- Machines Assisting Humans → AI helping in surgery, disaster management, education, etc.
- Algorithmic Game Theory & Computational Social Choice → AI for fair decision-making, voting systems, resource allocation.
- Multi-Agent Reinforcement Learning (MARL) → multiple AI agents learning and interacting in shared environments (e.g., coordination among self-driving cars, swarm robotics); see the sketch after this slide.
- Neuromorphic Computing → brain-inspired chips that process information like neurons, more efficiently than traditional processors.

How can you explain it in class:
- Begin with a simple analogy: "Earlier, machines replaced humans. Now, machines are starting to collaborate with us."
- Show a case: collaborative robots in manufacturing, or AI as a teaching assistant in education.
- Use a short demo or example: AlphaGo (game theory), drone swarms (multi-agent learning).
- Relate neuromorphic computing to the human brain: "Imagine chips that think like neurons: less power, more intelligence."
- Quick class question: "Do you think future AI will work with humans or replace them?"
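To give students a feel for MARL, here is a minimal, hypothetical sketch in plain NumPy: two independent Q-learning agents repeatedly play a toy coordination game (payoffs invented for illustration) and learn to pick the same action.

```python
# MARL in miniature: two independent Q-learners play a coordination game.
# Both get reward 1 only when they choose the same action (toy payoff,
# chosen purely for illustration).
import numpy as np

rng = np.random.default_rng(42)
n_actions, episodes = 2, 5000
alpha, epsilon = 0.1, 0.1

q = [np.zeros(n_actions), np.zeros(n_actions)]  # one Q-table per agent

for _ in range(episodes):
    actions = [
        rng.integers(n_actions) if rng.random() < epsilon else int(np.argmax(q[i]))
        for i in range(2)
    ]
    reward = 1.0 if actions[0] == actions[1] else 0.0  # shared reward
    for i in range(2):
        q[i][actions[i]] += alpha * (reward - q[i][actions[i]])

print("Agent 0 Q-values:", q[0])
print("Agent 1 Q-values:", q[1])
# Typically both agents converge on the same action, i.e. they coordinate.
```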

9.1 Expert Systems

What's in the syllabus:
- Popular Examples → MYCIN (medicine), DENDRAL (chemistry), XCON (computer configuration).
- Characteristics → rule-based, reasoning ability, explanation facility, domain-specific knowledge.
- Components → knowledge base, inference engine, user interface.
- Participants → domain experts, knowledge engineers, end users, system developers.
- Capabilities → problem-solving, reasoning, explanation, prediction.
- Advantages → consistency, 24×7 availability, handles complex problems.
- Limitations → lack of creativity, expensive to build, limited to the defined knowledge.
- Applications → medicine, engineering, customer support, agriculture.
- Technology → rule-based systems, frames, semantic networks.
- Development → identify the problem, knowledge acquisition, knowledge representation, testing & refinement.

How can you explain it in class:
- Begin with a simple definition: "An expert system is like putting a human expert's brain into a computer."
- Show a real-world case: MYCIN diagnosing infections, or modern medical AI assistants.
- Draw a simple diagram on the board: Knowledge Base + Inference Engine + User Interface (a tiny code sketch of these components follows after this slide).
- Connect to students: "Think of Google Maps suggesting a route; it works like an expert system for navigation."
- Quick discussion: "Can expert systems replace doctors or teachers, or just assist them?"
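To show the Knowledge Base + Inference Engine split in code, here is a minimal, hypothetical forward-chaining sketch in Python. The facts and rules are invented for illustration and are far simpler than MYCIN's roughly 450 rules.

```python
# Toy expert system: a knowledge base of if-then rules plus a tiny
# forward-chaining inference engine (facts and rules are made up).
KNOWLEDGE_BASE = [
    ({"fever", "cough"}, "possible_flu"),
    ({"possible_flu", "body_ache"}, "recommend_rest_and_fluids"),
    ({"rash", "fever"}, "refer_to_doctor"),
]

def infer(facts, rules):
    """Repeatedly fire any rule whose conditions are all satisfied."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# "User interface": the end user reports symptoms, the engine derives conclusions.
observed = {"fever", "cough", "body_ache"}
print(infer(observed, KNOWLEDGE_BASE))
# -> includes 'possible_flu' and 'recommend_rest_and_fluids'
```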

MYCIN (Medical Diagnosis, 1970s)
- Domain: medicine, diagnosing bacterial infections and recommending antibiotics.
- Approach: rule-based expert system (~450 rules).
- Function: asked doctors a series of yes/no and multiple-choice questions about patient symptoms, test results, etc., and suggested a diagnosis and antibiotic treatments.
- Strengths: showed how expert knowledge could be encoded into rules.
- Limitations: never widely deployed (legal and ethical issues around trusting machine-generated diagnoses).

DENDRAL (Chemistry, 1960s)
- Domain: organic chemistry, mass spectrometry analysis.
- Approach: rule-based expert system with heuristics.
- Function: interpreted mass spectrometry data to hypothesize possible molecular structures.
- Strengths: one of the first AI systems to outperform human experts in a narrow domain.
- Impact: proved AI could solve complex, specialized problems.

XCON (a.k.a. R1, 1980s)
- Domain: computer system configuration (Digital Equipment Corporation, DEC).
- Approach: rule-based system (~2,500 rules at its peak).
- Function: automatically configured orders of VAX (Virtual Address eXtension) computer systems: which components fit together, wiring, power supply, etc.
- Strengths: saved DEC tens of millions of dollars per year by reducing human errors in configuration.
- Impact: considered one of the most successful expert systems in industry.

9.2 Internet of Things (IoT)

What's in the syllabus:
- Applications → smart homes, smart cities, healthcare monitoring, agriculture (smart irrigation), industrial automation.
- IoT Products → smartwatches, smart TVs, Alexa/Google Home, connected cars, fitness trackers.
- Challenges → data privacy & security, high cost of devices, connectivity issues, standardization problems.
- Sensors → temperature, motion, pressure, proximity, biometric, GPS: the core components that collect data (see the sketch after this slide).

How can you explain it in class:
- Start with a relatable example: "Your smartwatch tracking your steps and sending the data to your phone is IoT in action."
- Show images or a video of IoT devices: Alexa controlling lights, farmers using soil sensors.
- Connect to students: "How many of you are already using IoT? Smart TVs? Fitness bands?"
- Quick discussion: "If IoT devices are always listening and monitoring, what are the risks?"
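If a short demo fits the class, the sketch below shows the typical IoT pattern of a sensor publishing readings to a broker over MQTT with the paho-mqtt library. The broker address, topic name, and simulated temperature values are all assumptions made for illustration; a real device would read from actual sensor hardware.

```python
# IoT sketch: a (simulated) temperature sensor publishes readings over MQTT.
# Broker, topic, and readings are hypothetical; a real device would read from
# hardware instead of random.uniform(). Uses the paho-mqtt 1.x Client() call;
# paho-mqtt 2.x additionally expects a CallbackAPIVersion argument.
import json
import random
import time

import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("test.mosquitto.org", 1883)   # public test broker (assumed reachable)
client.loop_start()

for _ in range(5):
    reading = {"sensor": "room-1", "temp_c": round(random.uniform(24, 30), 1)}
    client.publish("demo/classroom/temperature", json.dumps(reading))
    time.sleep(1)

client.loop_stop()
client.disconnect()
```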

9.3 Artificial Intelligence of Things (AIoT)

What's in the syllabus:
- How Does AIoT Work? → IoT devices collect data via sensors → AI analyzes this data → enables real-time decisions (e.g., predictive maintenance in factories); see the sketch after this slide.
- Where Does AI Unlock IoT? → adds intelligence to raw IoT data: anomaly detection, pattern recognition, automation.
- Applications & Examples → smart homes (voice assistants), healthcare (wearables detecting diseases), autonomous vehicles, smart factories.
- Benefits & Challenges → benefits: efficiency, automation, predictive insights; challenges: data privacy, infrastructure costs, integration complexity.
- Future of AIoT → edge AI for faster processing, 5G-enabled IoT, AI-powered smart cities, personalized healthcare.

How can you explain it in class:
- Begin with a simple analogy: "IoT is like the body's senses; AI is like the brain. Together, they make smarter decisions."
- Show an example video: a self-driving car (IoT sensors + AI decision-making).
- Connect locally: "Hospitals in India are experimenting with AIoT for patient monitoring, such as AI-based ECG analysis through wearables."
- Quick discussion: "If your home appliances learn your habits, is it convenience or an invasion of privacy?"
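To connect the "sensors collect, AI decides" loop to code, here is a minimal, hypothetical sketch: simulated vibration readings from a machine are scored with scikit-learn's IsolationForest, the kind of anomaly detection that underpins predictive maintenance. The data and parameters are invented for illustration.

```python
# AIoT sketch: anomaly detection on (simulated) machine vibration data,
# the core of predictive maintenance. Data and parameters are made up.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)

# "IoT" part: normal vibration readings, plus a few spikes from a failing bearing.
normal = rng.normal(loc=1.0, scale=0.1, size=(500, 1))
faulty = rng.normal(loc=2.5, scale=0.3, size=(5, 1))
readings = np.vstack([normal, faulty])

# "AI" part: learn what normal looks like and flag outliers as they stream in.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
labels = detector.predict(readings)           # +1 = normal, -1 = anomaly

print("Anomalies flagged at indices:", np.where(labels == -1)[0])
# In a real AIoT pipeline this decision would trigger a maintenance alert.
```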

Thank You. Module 4 Completed.