Unit I - Introduction to Measurement Systems.pptx


About This Presentation

MEASUREMENT AND INSTRUMENTATION


Slide Content

22BMT301 Biomedical Sensors and Transducers, by Mr. M. Murali, Assistant Professor / BME, J.N.N Institute of Engineering

22BMT301 BIOMEDICAL SENSORS AND TRANSDUCERS

UNIT I MEASUREMENT SYSTEM (9)
Measurement system – Instrumentation – Classification and characteristics of transducers – Static and dynamic – Errors in measurements and their statistical analysis – Methods of error analysis – Uncertainty analysis – Expression of uncertainty: accuracy and precision index, propagation of errors – Calibration (NABL-17025) – Primary and secondary standards.

UNIT II DISPLACEMENT, PRESSURE, TEMPERATURE SENSORS (9)
Strain gauge: gauge factor, sensing elements, configuration, and unbonded strain gauge. Capacitive transducer – various arrangements. Inductive transducer, LVDT. Passive types: RTD materials and range, relative resistance vs. temperature characteristics, thermistor characteristics. Active type: thermocouple – characteristics.

UNIT III PHOTOELECTRIC AND PIEZOELECTRIC SENSORS (9)
Phototube, scintillation counter, photomultiplier tube (PMT), photovoltaic and photoconductive cells, photodiodes, phototransistor, comparison of photoelectric transducers. Optical displacement sensors and optical encoders. Piezoelectric active transducer – equivalent circuit and its characteristics.

UNIT IV SIGNAL CONDITIONING CIRCUITS AND METERS (9)
Functions of signal conditioning circuits, preamplifiers, concepts of passive filters, impedance matching circuits. AC and DC bridges – Wheatstone bridge, Kelvin, Maxwell, Hay, Schering. Q-meter, PMMC, MI and dynamometer-type instruments – DC potentiometer – Digital voltmeter – Multimeter.

UNIT V ADVANCES IN SENSING TECHNOLOGIES (9)
Biosensors: classification of biosensors, immobilization of the bioreceptor. Biocatalyst-based biosensor: principle, construction and operation. Glucose biosensor, microbe biosensor, electrochemical biosensor. Smart sensors: salient features, architecture and applications.

TEXT BOOK(S)
1. A.K. Sawhney, "Electrical & Electronics Measurement and Instrumentation", Dhanpat Rai & Co., New Delhi, 19th revised edition 2011, reprint 2014.
2. John G. Webster, "Medical Instrumentation: Application and Design", 4th edition, Wiley India Pvt Ltd, New Delhi, 2015.

UNIT I FUNDAMENTALS OF MEASUREMENTS

Measurements: Measurement of a given quantity is essentially the act or result of comparison between the quantity (whose magnitude is unknown) and a predetermined or predefined standard. When the two quantities are compared, the result is expressed in numerical values.

Basic requirements for a meaningful measurement: The standard used for comparison must be accurately defined and commonly accepted. The apparatus used and the method adopted must be provable (verifiable).

Significance of Measurement: The importance of measurement is simply and eloquently expressed in the following statement by the famous physicist Lord Kelvin: "I often say that when you can measure what you are speaking about and can express it in numbers, you know something about it; when you cannot express it in numbers, your knowledge is of a meagre and unsatisfactory kind."

Two major functions of all branches of engineering: Design of equipment and processes. Proper operation and maintenance of equipment and processes.

Methods of Measurement: Direct methods. Indirect methods.

DIRECT METHODS: In these methods, the unknown quantity (called the measurand) is directly compared against a standard. INDIRECT METHODS: Measurement by direct methods is not always possible, feasible or practicable. In engineering applications, measurement systems are used that rely on indirect methods of measurement.

Instruments and Measurement Systems: Measurement involves the use of instruments as a physical means of determining quantities or variables. Because of the modular nature of the elements within it, it is common to refer to the measuring instrument as a MEASUREMENT SYSTEM.

Evolution of Instruments: Mechanical, Electrical and Electronic instruments. MECHANICAL: These instruments are very reliable under static and stable conditions, but their disadvantage is that they are unable to respond rapidly to measurements of dynamic and transient conditions.

Contd. ELECTRICAL: Faster than mechanical instruments; the output is indicated more rapidly than with mechanical methods, but it still depends on the mechanical movement of the meters. The response time is 0.5 to 24 seconds. ELECTRONIC: More reliable than the other systems; it uses semiconductor devices, and even weak signals can be detected.

Classification of Instruments: Absolute instruments. Secondary instruments. ABSOLUTE: These instruments give the magnitude of the quantity under measurement in terms of the physical constants of the instrument. SECONDARY: These instruments are calibrated by comparison with absolute instruments which have already been calibrated.

Further, instruments are classified as: Deflection-type instruments. Null-type instruments.

The functions of instruments and measurement systems can be classified into three: (i) Indicating function. (ii) Recording function. (iii) Controlling function. The applications of measurement systems are: (i) Monitoring of processes and operations. (ii) Control of processes and operations. (iii) Experimental engineering analysis.

Types of Instrumentation System: Intelligent instrumentation (data has been refined for the purpose of presentation). Dumb instrumentation (data must be processed by the observer).

Functional Elements of an Instrumentation System
[Block diagram] Quantity to be measured → Primary sensing element → Variable conversion element (detector-transducer stage) → Variable manipulation element → Data transmission element → Data conditioning element (intermediate stage) → Data presentation element (terminating stage).

Elements of a Generalized Measurement System: Primary sensing element. Variable conversion element. Data presentation element. PRIMARY SENSING ELEMENT: The quantity under measurement makes its first contact with the primary sensing element of the measurement system. VARIABLE CONVERSION ELEMENT: It converts the output of the primary sensing element into a form suitable for preserving the information content of the original signal.

Contd. DATA PRESENTATION ELEMENT: The information about the quantity under measurement has to be conveyed to the personnel handling the instrument or the system for monitoring, control or analysis purposes.

Performance Characteristics: Static characteristics. Dynamic characteristics.

Static Characteristics: Applications involving measurement of quantities that are either constant or vary slowly with time are known as static. The static characteristics are: Accuracy, Precision, Sensitivity, Error, Resolution, Threshold, Linearity, Zero drift.

Static Characteristics (contd.): Reproducibility, Stability, Tolerance, Range or Span, Bias, Hysteresis, Dead zone, Span drift.

ACCURACY: It is the closeness with which an instrument reading approaches the true value of the quantity being measured. TRUE VALUE: The true value of a quantity may be defined as the average of an infinite number of measured values. SENSITIVITY: It is defined as the ratio of the magnitude of the output response to the magnitude of the input.

ERROR: It is defined as the difference between the measured value and the true value of the quantity: δA = Am - At, where Am = measured value of the quantity and At = true value of the quantity. This is also called the absolute static error.
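A minimal sketch of these definitions in Python (an illustration, not from the slides; the function names and example readings are assumptions):

```python
def absolute_static_error(measured, true_value):
    """Absolute static error: dA = Am - At."""
    return measured - true_value

def relative_static_error(measured, true_value):
    """Relative static error as a percentage of the true value."""
    return (measured - true_value) / true_value * 100.0

# Example: a voltmeter reads 9.8 V when the true value is 10.0 V.
print(absolute_static_error(9.8, 10.0))   # -0.2 (V)
print(relative_static_error(9.8, 10.0))   # -2.0 (%)
```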

Static Characteristics of Instruments 6.Resolution: The smallest measurable input change of an instrument. For example, the resolution of a four-digit voltmeter with a range of 999.9V is 0.1V, whereas for a five-digit voltmeter of the same range, the resolution would be 0.01V.

7.Repeatability: Repeatability describes the closeness of output readings when the same input is applied repetitively over a short period of time, with the same measurement conditions, same instrument and observer, same location and same conditions of use maintained throughout. 8.Reproducibility: Reproducibility describes the degree of closeness of output readings for the same input when there are changes in the method of measurement, observer, measuring instrument, location, conditions of use and time of measurement.

9.Hysteresis: Hysteresis is a phenomenon that depicts different output effects when loading and unloading an instrument. Hysteresis is observed in any physical, chemical, mechanical or electrical curves. The non-coincidence of output when the input is increased and then decreased is on account of internal friction or hysteretic damping.

10. Range: The range of an instrument is defined as the minimum and maximum values between which the instrument can provide output values. For example, a voltmeter specified as (0-30) V can provide output between 0 V and 30 V; hence the range of the voltmeter is 0 V to 30 V. A voltmeter specified as (10-30) V can provide output between 10 V and 30 V; hence its range is 10 V to 30 V. Range = minimum value to maximum value.

Static Characteristics of Instruments 11. Span: The span of an instrument is defined as the difference between the maximum and minimum values of the instrument. For example, a voltmeter specified as (0-30) V has a span of 30 V, whereas a voltmeter specified as (10-30) V has a span of 20 V. Span = maximum value - minimum value.

Static Characteristics of Instruments 12.Input Impedance: The input impedance of an electrical network is the measure of the opposition to current (impedance), both static (resistance) and dynamic (reactance), into the load network that is external to the electrical source. 13.Loading Effect: Loading effect can be defined as the effect on the source by the load impedance. Usually loading effect reduces the voltage level of a voltage source.

16.Stability: Stability refers to the ability of an instrument to maintain a certain physical property at a constant value, while rejecting any perturbations in the environment. A stable instrument ensures that results are repeatable and reproducible. 17.Tolerance: Tolerance refers to the total allowable error within an item. This is typically represented as a +/- value off of a nominal specification. Products can become deformed due to changes in temperature and humidity, which lead to material expansion and contraction, or due to improper feedback from a process control device.

Static Characteristics of Instruments 18.Dead Space: Dead space is defined as the range of different input values over which there is no change in output value. Any instrument that exhibits hysteresis also displays dead space . Some instruments that do not suffer from any significant hysteresis can still exhibit a dead space in their output characteristics, however.

Noise: A spurious current or voltage extraneous to the current or voltage of interest in an electrical or electronic circuit is called noise.

Noise is classified as: Generated noise, Conducted noise, Radiated noise.

Dynamic Characteristics: Speed of response, Measuring lag, Fidelity, Dynamic error.

Dynamic Characteristics: SPEED OF RESPONSE: It is defined as the rapidity with which a measurement system responds to changes in the measured quantity. FIDELITY: It is defined as the degree to which a measurement system indicates changes in the measured quantity without any dynamic error.

Dynamic Error: It is the difference between the true value of the quantity changing with time and the value indicated by the measurement system, assuming no static error. It is also called measurement error.

Measuring Lag: It is the retardation or delay in the response of a measurement system to changes in the measured quantity. It is of two types: Retardation type: the response begins immediately after a change in the measured quantity has occurred. Time delay type: the response of the measurement system begins after a dead zone following the application of the input.

Errors in Measurement: Limiting errors (guarantee errors); Known errors. Classification of errors: Gross errors; Systematic or cumulative errors (instrumental, environmental, observational); Random, residual or accidental errors.

Limiting Error: In most instruments the accuracy is guaranteed to be within a certain percentage of the full-scale reading. The limits of these deviations from the specified value are defined as limiting errors or guarantee errors. Known Error: When the error of a quantity or an instrument is known, the effect of this error, when combined with other errors, can be computed in a manner similar to the combination of limiting errors. The difference is that in the case of known errors the signs of the relative errors are given and must be preserved in the calculations.

Gross Error: Human mistakes in reading, recording and calculating measurement results. The experimenter may grossly misread the scale, e.g. due to oversight reading 31.5 °C instead of 21.5 °C, or may transpose the reading while recording (reading 25.8 °C and recording it as 28.5 °C).

Systematic Errors: INSTRUMENTAL ERROR: These errors arise due to three reasons: inherent shortcomings in the instrument, misuse of the instrument, and loading effects of the instrument. ENVIRONMENTAL ERROR: These errors are due to conditions external to the measuring device, such as the effects of temperature, pressure, humidity, dust, or external electrostatic or magnetic fields. OBSERVATIONAL ERROR: The error on account of parallax is an observational error.

Residual Error: These errors are due to a multitude of small factors which change or fluctuate from one measurement to another. The happenings or disturbances about which we are unaware are lumped together and called "random" or "residual"; hence the errors caused by them are called random or residual errors.

Statistical Analysis

Arithmetic Mean: The most probable value of a measured variable is the arithmetic mean of the readings taken, where x1, x2, ..., xn are the readings of the samples and n is the number of readings.
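The mean formula on this slide was an image and did not extract; the standard definition, consistent with the notation above, is:

$$\bar{x} = \frac{x_1 + x_2 + \cdots + x_n}{n} = \frac{1}{n}\sum_{i=1}^{n} x_i$$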

Deviation: Deviation is the departure of an observed reading from the arithmetic mean of the group of readings.

Standard Deviation: The standard deviation of an infinite number of data points is defined as the square root of the sum of the individual deviations squared, divided by the number of readings.
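The deviation and standard-deviation formulas on these slides were also images; the standard forms matching the verbal definitions above are (with the usual n - 1 divisor when only a finite, small number of readings is available):

$$d_i = x_i - \bar{x}, \qquad \sigma = \sqrt{\frac{\sum_{i=1}^{n} d_i^2}{n}}, \qquad s = \sqrt{\frac{\sum_{i=1}^{n} d_i^2}{n-1}} \quad (\text{finite } n, \text{ typically } n < 20)$$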

Variance
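The variance formula did not extract either; variance is simply the mean-square deviation, i.e. the square of the standard deviation:

$$V = \sigma^2 = \frac{\sum_{i=1}^{n} d_i^2}{n}$$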

Probable Error: Probable error of one reading, r1 = 0.6745 s, where s is the standard deviation. Probable error of the mean, rm.
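The expression for the probable error of the mean did not survive extraction; the relation commonly used (and consistent with the worked example that follows) is:

$$r_1 = 0.6745\,s, \qquad r_m = \frac{r_1}{\sqrt{n-1}}$$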

Problem. Question: The following 10 observations were recorded when measuring a voltage: 41.7, 42.0, 41.8, 42.0, 42.1, 41.9, 42.0, 41.9, 42.5, 41.8 volts. Find: Mean, Standard deviation, Probable error, Range.

Answer: Mean = 41.97 V, S.D. = 0.22 V, Probable error = 0.15 V, Range = 0.8 V.
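A short Python check of this worked example (a sketch, not part of the slides; it uses the sample standard deviation with n - 1 in the denominator, which reproduces the quoted answers):

```python
import math

readings = [41.7, 42.0, 41.8, 42.0, 42.1, 41.9, 42.0, 41.9, 42.5, 41.8]

n = len(readings)
mean = sum(readings) / n                                  # arithmetic mean
deviations = [x - mean for x in readings]                 # d_i = x_i - mean
s = math.sqrt(sum(d**2 for d in deviations) / (n - 1))    # sample standard deviation
r1 = 0.6745 * s                                           # probable error of one reading
value_range = max(readings) - min(readings)               # range of the data

print(f"Mean = {mean:.2f} V")             # 41.97 V
print(f"S.D. = {s:.2f} V")                # 0.22 V
print(f"Probable error = {r1:.2f} V")     # 0.15 V
print(f"Range = {value_range:.1f} V")     # 0.8 V
```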

Uncertainty Analysis: Often the available data is a single-sample data set, and therefore the statistical methods discussed earlier cannot be applied directly. On account of the single-sample nature of the data, it is not possible to observe its scatter by plotting a frequency distribution curve; hence it becomes essential to modify our approach. Kline and McClintock have proposed a method based upon probability and statistics which analyses the data employing an uncertainty distribution rather than a frequency distribution. They defined the uncertainty distribution as the error distribution the experimenter believes would exist if the situation permitted multi-sampling.

Uncertainty Analysis: Kline and McClintock suggest that a single-sample result may be expressed in terms of a mean value and an uncertainty interval based upon stated odds. The result may be written as: U = Ū ± w (b to 1). EXAMPLE: The result of a temperature measurement may be expressed as 100 °C ± 1 °C, meaning there is an uncertainty of ±1 °C in the result. Written as 100 °C ± 1 °C (20 to 1), the result becomes more specific: the experimenter is willing to bet at 20 to 1 odds that the temperature measurement made is within ±1 °C of 100 °C.
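The Kline and McClintock treatment is normally paired with a root-sum-square rule for propagating individual uncertainties into a computed result (propagation of errors is listed in the Unit I syllabus). The sketch below is illustrative and not taken from the slides; the P = E²/R example and its numbers are assumptions:

```python
import math

def propagate_uncertainty(partials, uncertainties):
    """Root-sum-square (Kline-McClintock) combination:
    w_R = sqrt(sum((dR/dx_i * w_i)**2))."""
    return math.sqrt(sum((p * w) ** 2 for p, w in zip(partials, uncertainties)))

# Illustrative example: power dissipated in a resistor, P = E**2 / R,
# with E = 100 V +/- 1 V and R = 10 ohm +/- 0.1 ohm (assumed values).
E, w_E = 100.0, 1.0
R, w_R = 10.0, 0.1

dP_dE = 2 * E / R        # partial derivative of P with respect to E
dP_dR = -E**2 / R**2     # partial derivative of P with respect to R

P = E**2 / R
w_P = propagate_uncertainty([dP_dE, dP_dR], [w_E, w_R])
print(f"P = {P:.0f} W +/- {w_P:.1f} W")   # about 1000 W +/- 22.4 W
```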

Transducer: A transducer is a device that converts one type of energy into another type of energy for the purpose of measurement or transfer of information.

Difference Between a Sensor and a Transducer


Classification of Transducers: 1. Based on the physical phenomenon: (a) Primary transducer, (b) Secondary transducer. 2. Based on the power type: (a) Active transducer, (b) Passive transducer. 3. Based on the type of output: (a) Analog transducer, (b) Digital transducer.

Classification of Transducers (contd.): 4. Based on the electrical phenomenon: (a) Resistive transducer, (b) Capacitive transducer, (c) Inductive transducer, (d) Photoelectric transducer, (e) Photovoltaic transducer. 5. Based on the non-electrical phenomenon: (a) Linear displacement, (b) Rotary displacement. 6. Based on the transduction phenomenon: (a) Transducer, (b) Inverse transducer.

Calibration (NABL): Calibration is an essential process to be undertaken frequently for each instrument and measuring system. It is the process in which the test instrument is compared with a standard instrument. It consists of reading the standard and test instruments simultaneously while the input quantity is held constant at several values over the range of the test instrument. Calibration is best carried out under the stipulated environmental conditions. Certification of an instrument manufactured by industry is undertaken by the National Physical Laboratory and other laboratories having secondary and working standards.

Calibration (NABL): The National Accreditation Board for Testing and Calibration Laboratories (NABL). ISO/IEC 17025:2017 is an international standard which sets out the general requirements for the competence of testing and calibration laboratories. The purpose of this document is to inform laboratories about NABL's policies for accreditation as per ISO/IEC 17025:2017 (General Requirements for the Competence of Testing and Calibration Laboratories). Accreditation of testing laboratories is currently offered in the following disciplines: Chemical, Biological, Mechanical, Electrical, Electronics, Fluid Flow, Forensic, Non-Destructive Testing (NDT), Photometry, Radiological, Diagnostic Radiology QA Testing, and Software & IT Systems.

Technical Requirements of ISO/IEC 17025:2017 (NABL) Certification: The following are the technical requirements of ISO/IEC 17025:2017 that a laboratory needs to identify while implementing the system: Designate one person as Quality Manager for establishing and monitoring effective implementation of the management-system requirements. Designate one person as Technical Manager for establishing and monitoring effective implementation of the technical requirements. Define each test/calibration parameter, its range of testing, limit of detection and uncertainty of measurement, and calculate the uncertainty of measurement for each test/calibration parameter. Statistically analyse re-test or re-calibration results.

Benefits of NABL Certification as per ISO/IEC 17025:2017: Formal recognition of the competence of a laboratory by an accreditation body in accordance with international criteria has many advantages: Potential increase in business due to enhanced customer confidence and satisfaction. Savings in time and money due to reduction or elimination of the need for re-testing of products. Better control of laboratory operations and feedback to laboratories as to whether they have a sound quality assurance system and are technically competent. Increased confidence in testing/calibration data and in the personnel performing the work. Customers can search for and identify the laboratories accredited by NABL for their specific requirements from the Directory of Accredited Laboratories. Users of accredited laboratories enjoy greater access for their products, in both domestic and international markets, when tested by accredited laboratories.

Process of NABL Certification, Stage I: Prepare your laboratory's application for NABL certification, giving all required information and listing the test(s)/calibration(s), along with range and measurement uncertainty, which the laboratory has the competence to perform. A laboratory can apply either for all or part of its testing/calibration facilities. Formats NABL 151, NABL 152 and NABL 153 are to be used by testing, calibration and medical laboratories respectively when applying to NABL for certification. The laboratory has to take special care in filling in the scope of certification for which it wishes to apply. In case the laboratory finds any clause (in part or full) not applicable to it, it shall furnish the reasons. Laboratories are required to submit three sets of duly filled-in application forms for each field of testing/calibration, along with two sets of the Quality Manual and the application fees.

Process of NABL Certification, Stage I (contd.): On receipt of the application, the NABL Secretariat will issue an acknowledgement to the laboratory. After scrutiny of the application to confirm it is complete in all respects, a unique Customer Registration Number will be allocated to the laboratory for further processing of the application. The NABL Secretariat shall then nominate a Lead Assessor to give an Adequacy Report on the Quality Manual / application submitted by the laboratory. A copy of the Adequacy Report by the Lead Assessor will be provided to the laboratory for taking necessary corrective action, if any, and the laboratory shall submit a Corrective Action Report. After satisfactory corrective action by the laboratory, a Pre-Assessment audit of the laboratory will be organised by NABL. Laboratories must ensure their preparedness by carrying out an internal audit before the Pre-Assessment.

Process of NABL Certification, Stage II: The NABL Secretariat shall organise the Pre-Assessment audit, which shall normally be carried out by the Lead Assessor at the laboratory site(s). The pre-assessment helps the laboratory to be better prepared for the Final Assessment. It also helps the Lead Assessor to assess the preparedness of the laboratory to undergo the Final Assessment, and to determine the Technical Assessor(s) and total assessment man-days required vis-a-vis the scope of certification in the application submitted by the laboratory. A copy of the Pre-Assessment Report will be provided to the laboratory for taking necessary corrective action on any concerns raised during the audit. The laboratory shall submit a Corrective Action Report to the NABL Secretariat. After the laboratory confirms the completion of corrective actions, the Final Assessment of the laboratory shall be organised by NABL.

Process of NABL Certification, Stage III: The NABL Secretariat shall organise the Final Assessment at the laboratory site(s) for compliance with the NABL criteria, and for that purpose appoint an assessment team. The assessment team shall comprise a Lead Assessor and other Technical Assessor(s) in the relevant fields, depending upon the scope to be assessed. Assessors shall raise Non-Conformance(s), if any, and provide them to the laboratory in the prescribed format so that it gets the opportunity to close as many Non-Conformance(s) as it can before the closing meeting of the assessment. The Lead Assessor will provide a copy of the consolidated assessment report to the laboratory and send the original copy to the NABL Secretariat. The laboratory shall take necessary corrective action on the remaining Non-Conformance(s) / other concerns and shall submit a report to NABL within a maximum period of 2 months.

Process of NABL Certification, Stage IV: After satisfactory corrective action by the laboratory, the Certification Committee examines the findings of the assessment team and recommends additional corrective action, if any, by the laboratory. The Certification Committee determines whether the recommendations in the assessment report are consistent with NABL requirements and commensurate with the claims made by the laboratory in its application. The laboratory shall take corrective action on any concerns raised by the Certification Committee. The Certification Committee shall make the appropriate recommendations regarding certification of a laboratory to the NABL Secretariat. Laboratories are free to appeal against the findings of the assessment or the decision on certification by writing to the Director, NABL. Whenever possible, NABL will depute its own technical personnel to be present at the time of assessment as Coordinator and NABL Observer. Sometimes, NABL may at its own cost depute a newly trained Technical Assessor as "Observer", subject to the convenience of the laboratory being assessed.

Process of NABL Certification, Stage V: Certification of a laboratory shall be valid for a period of 2 years, and NABL shall conduct periodic surveillance of the laboratory at intervals of one year. The laboratory shall apply for renewal of certification at least 6 months before the expiry of the validity of its certification.

Standards: A standard is a physical representation of a unit of measurement. The term 'standard' is applied to a piece of equipment having a known measure of a physical quantity. Standards have been developed for all the fundamental units as well as some of the derived mechanical and electrical units.

Types of Standards: International standards (defined by international agreement). Primary standards (maintained by national standards laboratories). Secondary standards (used by industrial measurement laboratories). Working standards (used in the general laboratory).

International Standards: The international standards are defined by international agreement. They represent certain units of measurement to the closest possible accuracy that production and measurement technology allow. These standards are periodically checked by absolute measurements in terms of the fundamental units. They are maintained at the International Bureau of Weights and Measures and are not available to ordinary users for measurements.

International ohm: It is defined as the resistance offered, to the flow of a constant current at the melting point of ice, by a column of mercury having a mass of 14.4521 grams, a uniform cross-sectional area and a length of 106.300 cm. International ampere: It is an unvarying current which, when passed through a solution of silver nitrate in water, deposits silver at the rate of 0.00111800 grams per second (g/s).

Primary Standards: The primary (basic) standards are maintained by national standards laboratories in different parts of the world. A primary standard is accurate enough that it is not calibrated by or subordinate to other standards. The main function of primary standards is the calibration and verification of secondary standards. These standards are not available for use outside the national laboratories.

Secondary Standards: Secondary standards are the basic reference standards used in industrial measurement laboratories. These standards are maintained by the particular industry involved and are checked locally against other reference standards in the area. Secondary standards are generally sent to the international standards laboratories on a periodic basis for calibration and comparison against the primary standards. They are then returned to the industrial user with certification of their measured value in terms of the primary standard.

Working Standards: Working standards are the principal tools of a measurement laboratory. They are used to check laboratory instruments for accuracy and performance, and to perform comparison measurements in industrial applications. For example, manufacturers of components such as capacitors and resistors use a working standard for checking the values of the components being manufactured.

THANK YOU
