Project control and process instrumentation

46 slides · Mar 08, 2021

About This Presentation

The Seven Core Metrics
Management Indicators
Quality Indicators
Life-Cycle Expectations
Pragmatic Software Metrics
Metrics Automation
Tailoring the Process
Process Discriminants
Example: Small-Scale Project Versus Large-Scale Project


Slide Content

PROJECT CONTROL AND PROCESS
INSTRUMENTATION
•The Seven Core Metrics
Management Indicators
Quality Indicators
Life-Cycle Expectations
Pragmatic Software Metrics
Metrics Automation
•Tailoring the Process
Process Discriminants
Example: Small-Scale Project Versus Large-Scale Project

Project Control and Process Instrumentation
The modern software development process tackles the central management issues of complex software:
1. Getting the design right by focusing on the architecture first
2. Managing risk through iterative development
3. Reducing complexity with component-based techniques
4. Making software progress and quality tangible through instrumented change management
5. Automating the overhead and bookkeeping activities through the use of round-trip engineering and integrated environments
The goals of software metrics are:
• An accurate assessment of progress to date
• Insight into the quality of the evolving software product
• A basis for estimating the cost and schedule for completing the product with increasing accuracy over time

THE SEVEN CORE METRICS
MANAGEMENT INDICATORS
1.Work and progress (work performed over time)
2.Budgeted cost and expenditures (cost incurred over time)
3.Staffing and team dynamics (personnel changes over time)
QUALITY INDICATORS
1.Change traffic and stability (change traffic over time)
2.Breakage and modularity (average breakage per change over time)
3.Rework and adaptability (average rework per change over time)
4.Mean time between failures (MTBF) and maturity (defect rate
over time)

THE SEVEN CORE METRICS
The attributes of the seven core metrics include the following:
• They are simple, objective, easy to collect, easy to interpret, and hard to misinterpret.
• Collection can be automated and nonintrusive.
• They provide for consistent assessments throughout the life cycle and are derived from the evolving product baselines rather than from a subjective assessment.
• They are useful to both management and engineering personnel for communicating progress and quality in a consistent format.
• Their fidelity improves across the life cycle.

Seven Core Metrics Overview

Metric | Purpose | Perspectives
Work and progress | Iteration planning, plan vs. actuals, management indicator | SLOC, function points, object points, scenarios, test cases, SCOs
Budgeted cost and expenditures | Financial insight, plan vs. actuals, management indicator | Cost per month, full-time staff per month, percentage of budget expended
Staffing and team dynamics | Resource plan vs. actuals, hiring rate, attrition rate | People per month added, people per month leaving
Change traffic and stability | Iteration planning, management indicator of schedule convergence | Software changes
Breakage and modularity | Convergence, software scrap, quality indicator | Reworked SLOC per change, by type, by release/component/subsystem
Rework and adaptability | Convergence, software rework, quality indicator | Average hours per change, by type, by release/component/subsystem
MTBF and maturity | Test coverage/adequacy, robustness for use, quality indicator | Failure counts, test hours until failure, by release/component/subsystem

MANAGEMENT INDICATORS
There are three fundamental sets of management metrics: technical progress, financial status, and staffing progress.
• By examining these perspectives, management can assess whether a project is on budget and on schedule.
• Financial status is very well understood. Most managers know their resource expenditures in terms of costs and schedule.
• Conventional projects, whose intermediate products were all paper documents, relied on subjective assessments of technical progress or measured the number of documents completed.
• While these documents did reflect progress in expending energy, they were not very indicative of useful work being accomplished.
• The management indicators include standard financial status based on an earned value system, objective technical progress metrics tailored to the primary measurement criteria for each major team of the organization, and staffing metrics that provide insight into team dynamics.

WORK AND PROGRESS -MANAGEMENT INDICATOR
The various activities of an iterative development project can be measured by defining a planned estimate of the work in an objective measure, then tracking progress (work completed over time) against that plan.
Each major organizational team should have at least one primary progress perspective that it is measured against. For the standard teams, the perspectives of this metric would be as follows:
•Software architecture team: use cases demonstrated
•Software development team: SLOC under baseline change management, SCOs
closed
•Software assessment team: SCOs opened, test hours executed, evaluation criteria
met
•Software management team: milestones completed

BUDGETED COST AND EXPENDITURES
-MANAGEMENT INDICATOR
• Plan the near-term activities (less than six months) in detail and leave the far-term activities as rough estimates, to be refined as the current iteration is winding down and planning for the next iteration becomes crucial.
• Financial performance measurement provides highly detailed cost and schedule insight. Its major weakness for software projects is assessing technical progress (% complete) objectively and accurately.
• Modern software processes are amenable to financial performance measurement through an earned value approach. The basic parameters of an earned value system are:
• Expenditure plan: the planned spending profile for a project over its planned schedule.
• Actual progress: technical accomplishment relative to the planned progress underlying the spending profile. In a healthy project, the actual progress tracks planned progress closely.
• Actual cost: the actual spending profile for a project over its actual schedule. In a healthy project, this profile tracks the planned profile closely.
• Earned value: the value that represents the planned cost of the actual progress.
• Cost variance: the difference between the actual cost and the earned value. Positive values correspond to over-budget situations and negative values correspond to under-budget situations.
• Schedule variance: the difference between the planned cost and the earned value. Positive values correspond to behind-schedule situations; negative values correspond to ahead-of-schedule situations.
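The variance definitions above can be sketched in a few lines of Python; the function names and example figures are illustrative, not part of the source, and the sign conventions follow the text (positive cost variance means over budget, positive schedule variance means behind schedule).

```python
def cost_variance(actual_cost, earned_value):
    """Actual cost minus earned value: positive -> over budget,
    negative -> under budget (sign convention from the text)."""
    return actual_cost - earned_value

def schedule_variance(planned_cost, earned_value):
    """Planned cost minus earned value: positive -> behind schedule,
    negative -> ahead of schedule."""
    return planned_cost - earned_value

# Hypothetical status: $100K planned to date, $110K actually spent,
# and the work actually completed was planned to cost $90K.
print(cost_variance(110_000, 90_000))      # 20000 -> over budget
print(schedule_variance(100_000, 90_000))  # 10000 -> behind schedule
```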

BUDGETED COST AND EXPENDITURES
-MANAGEMENT INDICATOR
Example: objective completion states for assessing the percentage complete of an evolving document artifact:
•0 to 50%: content incomplete
•50%: draft content; author has completed first draft text and art
•65%: initial text baseline; initial text editing complete
•75%: reviewable baseline; text and art editing complete
•80%: updated baseline; cross-chapter consistency checked
•90%: reviewed baseline; author has incorporated external reviewer comments
•100%: final edit; editor has completed a final cleanup pass
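These completion states map naturally onto an earned value calculation for document artifacts. A minimal sketch in Python, assuming each artifact carries a budgeted cost in staff-hours (the budgets and state names are hypothetical illustrations of the list above):

```python
# Earned value credit for each objective completion state.
STATE_CREDIT = {
    "draft content": 0.50,
    "initial text baseline": 0.65,
    "reviewable baseline": 0.75,
    "updated baseline": 0.80,
    "reviewed baseline": 0.90,
    "final edit": 1.00,
}

def artifact_earned_value(budgeted_hours, state):
    """Earned value for one artifact: planned cost times the credit
    for its current state (0.0 while content is still incomplete)."""
    return budgeted_hours * STATE_CREDIT.get(state, 0.0)

# Two chapters budgeted at 40 hours each: one at reviewable baseline,
# one through final edit -> 30.0 + 40.0 = 70.0 earned of 80 budgeted.
earned = (artifact_earned_value(40, "reviewable baseline")
          + artifact_earned_value(40, "final edit"))
print(earned)  # 70.0
```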

STAFFING AND TEAM DYNAMICS
-MANAGEMENT INDICATOR
• An iterative development should start with a small team until the risks in the requirements and architecture have been suitably resolved.
• Depending on the overlap of iterations and other project-specific circumstances, staffing can vary. For discrete, one-of-a-kind development efforts (such as building a corporate information system), the staffing profile in the figure would be typical.

STAFFING AND TEAM DYNAMICS
-MANAGEMENT INDICATOR
It is reasonable to expect the maintenance team to be smaller than the development team for these sorts of developments.
For a commercial product development, the sizes of the maintenance and development teams may be the same.
Tracking actual versus planned staffing is a necessary and well-understood management metric. Increases in staff can slow overall project progress as new people consume the productive time of existing people in coming up to speed.
Low attrition of good people is a sign of success.
Engineers are highly motivated by making progress in getting something to work; this is the recurring theme underlying an efficient iterative development process.
If this motivation is not there, good engineers will migrate elsewhere. An increase in unplanned attrition, namely people leaving a project prematurely, is one of the most glaring indicators that a project is destined for trouble.
The causes of such attrition can vary, but they are usually personnel dissatisfaction with management methods, lack of teamwork, or probability of failure in meeting the planned objectives.

QUALITY INDICATORS
1.Change traffic and stability (change traffic over time)
2.Breakage and modularity (average breakage per change over time)
3.Rework and adaptability (average rework per change over time)
4.Mean time between failures (MTBF) and maturity (defect rate over time)
1. CHANGE TRAFFIC AND STABILITY:
• Change traffic is defined as the number of software change orders (SCOs) opened and closed over the life cycle, as in the figure. This metric can be collected by change type, by release, across all releases, by team, by component, by subsystem, and so forth.

• Coupled with the work and progress metrics, it provides insight into the stability of the software and its convergence toward stability (or divergence toward instability).
• Stability is defined as the relationship between opened versus closed SCOs.
• The change traffic relative to the release schedule provides insight into schedule predictability, which is the primary value of this metric and an indicator of how well the process is performing.
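The opened-versus-closed relationship can be sketched as a running SCO backlog; the monthly counts below are hypothetical:

```python
def sco_backlog(opened_per_period, closed_per_period):
    """Stability trend: cumulative opened minus cumulative closed SCOs.
    A backlog converging toward zero indicates the baseline is stabilizing;
    a growing backlog indicates divergence toward instability."""
    backlog, opened, closed = [], 0, 0
    for o, c in zip(opened_per_period, closed_per_period):
        opened += o
        closed += c
        backlog.append(opened - closed)
    return backlog

# Four months of change traffic: the open/closed gap shrinks toward release.
print(sco_backlog([10, 8, 5, 2], [4, 9, 8, 4]))  # [6, 5, 2, 0]
```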
2. BREAKAGE AND MODULARITY:
• Breakage is defined as the average extent of change, which is the amount of software baseline that needs rework (in SLOC, function points, components, subsystems, files, etc.).
• Modularity is the average breakage trend over time. For a healthy project, the trend expectation is decreasing or stable, as in the figure.
• This indicator provides insight into the benign or malignant character of software change.
• In a mature iterative development process, earlier changes are expected to result in more scrap than later changes.
• Breakage trends that are increasing with time clearly indicate that product maintainability is suspect.

3. REWORK AND ADAPTABILITY:
• Rework is defined as the average cost of change, which is the effort to analyze, resolve, and retest all changes to software baselines.
• Adaptability is the rework trend over time. For a healthy project, the trend expectation is decreasing or stable, as in the figure.
• Not all changes are created equal. Some changes can be made in a staff-hour, while others take staff-weeks. This metric provides insight into rework measurement.
• In a mature iterative development process, earlier changes (architectural changes, which affect multiple components and people) are expected to require more rework than later changes (implementation changes).

4. MTBF AND MATURITY:
• MTBF is the average usage time between software faults. In rough terms, MTBF is computed by dividing the test hours by the number of type 0 and type 1 SCOs.
• Maturity is the MTBF trend over time, as in the figure.
• Early insight into maturity requires that an effective test infrastructure be established.
• Conventional testing approaches for monolithic software programs focused on achieving complete test coverage of every line of code and every branch.
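The rough MTBF computation described above can be sketched directly; the hour and defect counts are hypothetical (type 0 and type 1 denote the most critical SCO categories):

```python
def mtbf(test_hours, type0_scos, type1_scos):
    """Rough MTBF: usage/test hours divided by the count of critical
    (type 0 and type 1) software change orders."""
    critical = type0_scos + type1_scos
    if critical == 0:
        return float("inf")  # no critical failures observed yet
    return test_hours / critical

# 500 test hours with 2 type 0 and 3 type 1 SCOs -> 100 hours between failures.
print(mtbf(500, 2, 3))  # 100.0
```

A rising MTBF across releases is the maturity trend the text describes.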

• Software errors can be categorized into two types: deterministic and nondeterministic. Physicists characterize these as Bohr-bugs and Heisen-bugs, respectively.
• Bohr-bugs represent a class of errors that always result when the software is stimulated in a certain way. These errors are caused by coding errors, and changes are typically isolated to a single component.
• Heisen-bugs are software faults that are coincidental with a certain probabilistic occurrence of a given situation. These errors are design errors and typically are not repeatable even when the software is stimulated in the same apparent way.
• To provide adequate test coverage and resolve the statistically significant Heisen-bugs, extensive statistical testing under realistic and randomized usage scenarios is necessary.
• Modern, distributed systems with numerous interoperating components executing across a network of processors are vulnerable to Heisen-bugs, which are far more complicated to detect, analyze, and resolve.
• The best way to mature a software product is to establish an initial test infrastructure that allows execution of randomized usage scenarios early in the life cycle, and continuously evolve the breadth and depth of usage scenarios to optimize coverage across the reliability-critical components.

LIFE-CYCLE EXPECTATIONS
• The quality indicators are derived from the evolving product rather than from the artifacts.
• They provide insight into the waste generated by the process. Scrap and rework metrics are a standard measurement perspective of most manufacturing processes.
• They recognize the inherently dynamic nature of an iterative development process. Rather than focus on the value, they explicitly concentrate on the trends or changes with respect to time.
• The combination of insight from the current value and the current trend provides tangible indicators for management action.

Default pattern of life-cycle metrics evolution

PRAGMATIC SOFTWARE METRICS
The basic characteristics of a good metric are:
1. It is considered meaningful by the customer, manager, and performer.
If any one of these stakeholders does not see the metric as meaningful, it will not be used. "The customer is always right" is a sales motto, not an engineering tenet.
2. It demonstrates quantifiable correlation between process perturbations and business performance.
The only real organizational goals and objectives are financial: cost reduction, revenue increase, and margin increase.
3. It is objective and unambiguously defined.
Objectivity should translate into some form of numeric representation (such as numbers, percentages, ratios) as opposed to textual representations (such as excellent, good, fair, poor).
Ambiguity is minimized through well-understood units of measurement (such as staff-month, SLOC, change, function point, class, scenario, requirement), which are surprisingly hard to define precisely in the software engineering world.
4. It displays trends. This is an important characteristic.
Understanding the change in a metric's value with respect to time, subsequent projects, subsequent releases, and so forth is an extremely important perspective, especially for today's iterative development models. It is very rare that a given metric drives the appropriate action directly.

PRAGMATIC SOFTWARE METRICS
5. It is a natural by-product of the process.
The metric does not introduce new artifacts or overhead activities; it is derived directly from the mainstream engineering and management workflows.
6. It is supported by automation.
Experience has demonstrated that the most successful metrics are those that are collected and reported by automated tools, in part because software tools require rigorous definitions of the data they process.

METRICS AUTOMATION
• Automation improves management insight into progress and quality trends.
• A software project control panel (SPCP) maintains an on-line version of the status of evolving artifacts.
• It provides a display panel (dashboard) that integrates data from multiple sources to show the current status of some aspect of the project.
• The panel can support standard features such as warning lights, thresholds, variable scales, digital formats, and analog formats to present an overview of the current status.
An SPCP defines and develops the following:
• Metrics primitives: indicators, trends, comparisons, and progressions
• A graphical user interface: GUI support for a software project manager role and flexibility to support other roles
• Metrics collection agents: data extraction from the environment tools that maintain the engineering notations for the various artifact sets
• Metrics data management server: data management support for populating the metric displays of the GUI and storing the data extracted by the agents
• Metrics definitions: actual metrics presentations for requirements progress, design progress, implementation progress, assessment progress, and other progress dimensions
• Actors: typically, the monitor and the administrator

METRICS AUTOMATION
Monitor: defines panel layouts from existing mechanisms, graphical objects, and linkages to project data; queries data to be displayed at different levels of abstraction.
Administrator: installs the system; defines new mechanisms, graphical objects, and linkages; handles archiving functions; defines composition and decomposition structures for displaying multiple levels of abstraction.
Indicators may display data in formats that are binary (such as black and white), tertiary (such as red, yellow, and green), digital (integer or float), or some other enumerated type (such as sun..sat, jan..dec).
A trend graph presents values over time and permits upper and lower thresholds. A progression graph presents elements of progress between states, and an earned value is associated with each state.

METRICS AUTOMATION
Example: an SPCP for a project. The software project manager role has defined a top-level display with four graphical objects.
1. Project activity status. The graphical object in the upper left provides an overview of the status of the top-level WBS elements. The seven elements are coded as red, yellow, and green to reflect the current earned value status. Green represents ahead of plan, yellow indicates within 10% of plan, and red identifies elements that have a greater than 10% cost or schedule variance. This graphical object provides several examples of indicators: tertiary colors, the actual percentage, and the current first derivative.
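The tertiary color coding for the project activity status object can be sketched as a simple threshold function. This is an illustrative interpretation: it assumes negative variance means ahead of plan, consistent with the variance sign conventions given earlier.

```python
def status_color(variance_pct):
    """Tertiary indicator from an element's cost or schedule variance,
    expressed as a percentage of plan."""
    if variance_pct <= 0:
        return "green"   # ahead of (or on) plan
    if variance_pct <= 10:
        return "yellow"  # within 10% of plan
    return "red"         # greater than 10% variance

print([status_color(v) for v in (-3, 7, 15)])  # ['green', 'yellow', 'red']
```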

METRICS AUTOMATION
2. Technical artifact status. The graphical object in the upper right provides an overview of the status of the evolving technical artifacts. The Req light would display an assessment of the current state of the use case models and requirements specifications. The Des light would do the same for the design models, the Imp light for the source code baseline, and the Dep light for the test program.
3. Milestone progress. The graphical object in the lower left provides a progress assessment of the achievement of milestones against plan and provides indicators of the current values.
4. Action item progress. The graphical object in the lower right provides a different perspective of progress, showing the current number of open and closed issues.
Basic operational concept for a user of an SPCP monitoring the control panel:
1. The SPCP starts and shows the most current information.
2. The user selects from a list of previously defined default panel preferences.
3. The user selects whether the metric should be displayed for a given point in time or in a graph, as a trend. The default for trends is monthly.
4. The user points to a graphical object and requests that the control values for that metric and point in time be displayed.
5. The user points to a graphical object displaying a point in time and drills down to view the trend for the metric.
6. The user points to a graphical object displaying a trend and drills down to view the values for the metric.
7. The user points to a graphical object displaying a point in time and drills down to view the next level of information.
8. The user points to a graphical object displaying an indicator and drills down to view the breakdown of the next level of indicators.

Tailoring the Process
•Process Discriminants
•Example: Small-Scale Project Versus Large-Scale Project

Process Discriminants
• There are two primary dimensions of process variability: technical complexity and management complexity.
• The formality of reviews, the quality control of artifacts, the priorities of concerns, and numerous other process instantiation parameters are governed by the point a project occupies in these two dimensions.
• A process framework must be injected, and the methods, techniques, culture, formality, and organization must be tailored to the specific domain to achieve success.
The project process is organized around six process parameters:
1. Scale
2. Stakeholder cohesion
3. Process flexibility
4. Process maturity
5. Architectural risk
6. Domain experience

Process Discriminants
Figure: the two primary dimensions of process variability, technical complexity and management complexity.
Higher technical complexity:
• Embedded, real-time, distributed, fault-tolerant
• High-performance, portable
• Unprecedented, architecture re-engineering
Lower technical complexity:
• Straightforward automation, single thread
• Interactive performance, single platform
• Many precedent systems, application re-engineering
Higher management complexity:
• Large scale
• Contractual
• Many stakeholders
• "Projects"
Lower management complexity:
• Smaller scale
• Informal
• Few stakeholders
• "Products"
Average software project (near the middle of both dimensions):
• 5 to 10 people
• 10 to 12 months
• 3 to 5 external interfaces
• Some unknowns, risks

Priorities for tailoring the process framework
Higher technical complexity:
• More domain experience required
• Longer inception and elaboration phases
• More iterations for risk management
• Less-predictable costs and schedules
Lower technical complexity:
• More emphasis on existing assets
• Shorter inception and elaboration phases
• Fewer iterations
• More-predictable costs and schedules
Higher management complexity:
• More emphasis on risk management
• More process formality
• More emphasis on teamwork
• Longer inception and elaboration phases
Lower management complexity:
• Less emphasis on risk management
• Less process formality
• More emphasis on individual skills
• Longer production and transition phases

1. Scale
The most important factor in tailoring a software process framework is the total scale of the software application. There are many ways to measure scale, including number of source lines of code, number of function points, number of use cases, and number of dollars.
From a process tailoring perspective, the primary measure of scale is the size of the team. As the head count increases, the importance of consistent interpersonal communications becomes paramount.
Five people is an optimal size for an engineering team. Many studies indicate that most people can best manage four to seven things at a time.
Different management approaches are needed to manage a team of 1 (trivial), a team of 5 (small), a team of 25 (moderate), a team of 125 (large), a team of 625 (huge), and so on.
As team size grows, a new level of personnel management is introduced at roughly each factor of 5.
Trivial-sized projects require almost no management overhead (planning, communication, coordination, progress assessment, review, administration). There is little need to document the intermediate artifacts. Workflow is single-threaded. Performance is highly dependent on personnel skills.
Small projects (5 people) require very little management overhead, but team leadership toward a common objective is crucial. Communication is needed among team members. Project milestones are easily planned, informally conducted, and easily changed. There is a small number of individual workflows. Performance depends primarily on personnel skills. Process maturity is relatively unimportant. Individual tools can have a considerable impact on performance.
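The factor-of-5 observation implies that the number of personnel management levels grows roughly logarithmically with head count. A hypothetical sketch, not from the source:

```python
def management_levels(team_size):
    """Roughly one new level of personnel management per factor of 5
    in team size: 1 -> 0 levels, 5 -> 1, 25 -> 2, 125 -> 3, 625 -> 4."""
    levels, capacity = 0, 1
    while capacity < team_size:
        capacity *= 5  # each level of leads can oversee ~5 units below it
        levels += 1
    return levels

print([management_levels(n) for n in (1, 5, 25, 125, 625)])  # [0, 1, 2, 3, 4]
```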

Moderate-sized projects (25 people) require moderate management overhead, including a dedicated software project manager to synchronize team workflows and balance resources. Overhead workflows across all team leads are necessary for review, coordination, and assessment.
• There is a definite need to communicate the intermediate artifacts among teams.
• Project milestones are formally planned and conducted, and the impacts of changes are typically benign. There is a small number of concurrent team workflows, each with multiple individual workflows.
• Performance is highly dependent on the skills of key personnel, especially team leads. Process maturity is valuable. An environment can have a considerable impact on performance, but success can be achieved with certain key tools in place.
Large projects (125 people) require substantial management overhead, including a dedicated software project manager and several subproject managers to synchronize project-level and subproject-level workflows and to balance resources.
• There is significant expenditure in overhead workflows across all team leads for dissemination, review, coordination, and assessment. Intermediate artifacts are explicitly emphasized to communicate engineering results across many diverse teams.
• Project milestones are formally planned and conducted, and changes to milestone plans are expensive. Large numbers of concurrent team workflows are necessary, each with multiple individual workflows.
• Performance is highly dependent on the skills of key personnel, especially subproject managers and team leads.

Project performance is dependent on average people, for two reasons:
1. There are numerous mundane jobs in any large project, especially in the overhead workflows.
2. The probability of recruiting, maintaining, and retaining a large number of exceptional people is small.
Huge projects (625 people) require substantial management overhead, including multiple software project managers and many subproject managers to synchronize project-level and subproject-level workflows and to balance resources.
• There is significant expenditure in overhead workflows across all team leads for dissemination, review, coordination, and assessment. Intermediate artifacts are explicitly emphasized to communicate engineering results across many diverse teams.
• Project milestones are very formally planned and conducted, and changes to milestone plans typically cause malignant re-planning. There are very large numbers of concurrent team workflows, each with multiple individual workflows. Performance is highly dependent on the skills of key personnel, especially subproject managers and team leads.
• Software process maturity and domain experience are mandatory to avoid risks and ensure synchronization of expectations across numerous stakeholders.
• A mature, highly integrated, common environment across the development teams is necessary to manage change, automate artifact production, maintain consistency among the evolving artifacts, and improve the return on investment of common processes, common tools, common notations, and common metrics.

Table: Process discriminators that result from differences in project size

2. STAKEHOLDER COHESION OR CONTENTION
• The degree of cooperation and coordination among stakeholders (buyers, developers, users, subcontractors, and maintainers, among others) can significantly drive the specifics of how a process is defined.
• Cohesive teams have common goals, complementary skills, and close communications. Adversarial teams have conflicting goals, competing or incomplete skills, and less-than-open communications.
• A product that is funded, developed, marketed, and sold by the same organization can be set up with a common goal (for example, profitability). A small, collocated organization can be established that has a cohesive skill base and excellent day-to-day communications among team members.
• It is much more difficult to set up a large contractual effort without some contention across teams. A development contractor rarely has all the necessary software or domain expertise and frequently must team with multiple subcontractors, who have competing profit goals.
• Funding authorities and users want to minimize cost, maximize the feature set, and accelerate time to market, while development contractors want to maximize profitability.
• Large teams are almost impossible to collocate, and synchronizing stakeholder expectations is challenging. All these factors tend to degrade team cohesion and must be managed continuously.

Table: Process discriminators that result from differences in stakeholder cohesion

3. PROCESS FLEXIBILITY OR RIGOR
• The degree of rigor, formality, and change freedom inherent in a specific project's "contract" (vision document, business case, and development plan) will have a substantial impact on the implementation of the project's process.
• For very loose contracts, such as building a commercial product within a business unit of a software company, management complexity is lower. Here, processes, features, time to market, budget, and quality can be freely traded off and changed with little overhead.
• For example, if a company wanted to eliminate a few features in a product under development to capture market share from the competition by accelerating the product release, it would be feasible to make this decision in less than a week. The entire coordination effort might involve only the development manager, marketing manager, and business unit manager coordinating some key commitments.
• For a very rigorous contract, it could take many months to authorize a change in a release schedule.
• For example, to avoid a large custom development effort, it might be desirable to incorporate a new commercial product into the overall design of a next-generation air traffic control system. This would require coordination among the development contractor, funding agency, users, certification agencies, and associate contractors for interfacing systems.
• Large-scale, catastrophic cost-of-failure systems have extensive contractual rigor and require significantly different management approaches.

Table: Process discriminators that result from differences in process flexibility

4. PROCESS MATURITY
• The process maturity level of the development organization, as defined by the Software Engineering Institute's Capability Maturity Model, is another key driver of management complexity.
• Managing a mature process (level 3 or higher) is far simpler than managing an immature process (levels 1 and 2).
• Organizations with a mature process typically have a high level of precedent experience in developing software and a high level of existing process collateral that enables predictable planning and execution of the process.
• This sort of collateral includes well-defined methods, process automation tools, trained personnel, planning metrics, artifact templates, and workflow templates.
• Tailoring a mature organization's process for a specific project is generally a straightforward task.

Table: Process discriminators that result from differences in process maturity

5. ARCHITECTURAL RISK
• Some of the most important and recurring sources of architectural risk are system performance (resource utilization, response time, throughput, accuracy), robustness to change (addition of new features, incorporation of new technology, adaptation to dynamic operational conditions), and system reliability (predictable behavior, fault tolerance).
• The degree to which these risks can be eliminated before construction begins can have dramatic ramifications in the process tailoring.

6. DOMAIN EXPERIENCE
• An organization's domain experience governs its ability to converge on an acceptable architecture in a minimum number of iterations. An organization that has built five generations of radar control switches may be able to converge on an adequate baseline architecture for a new radar application in two or three prototype release iterations.
• A skilled software organization building its first radar application may require four or five prototype releases before converging on an adequate baseline.

Example: Small-Scale Project vs. Large-Scale Project
• Some of the dimensions of flexibility, priority, and fidelity can change when a process framework is applied to different (small/large) applications, projects, and domains.
• The table illustrates the differences in schedule distribution for large and small projects across the life-cycle phases.
• A small commercial project (for example, a 50,000-source-line Visual Basic Windows application, built by a team of five) may require only 1 month of inception, 2 months of elaboration, 5 months of construction, and 2 months of transition.
• A large, complex project (for example, a 300,000-source-line embedded avionics program, built by a team of 40) could require 8 months of inception, 14 months of elaboration, 20 months of construction, and 8 months of transition.
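The two example schedules can be compared as fractions of total elapsed time. A sketch using the durations from the bullets above (the dictionary names are illustrative):

```python
# Phase durations in months, taken from the two examples above.
SMALL = {"inception": 1, "elaboration": 2, "construction": 5, "transition": 2}
LARGE = {"inception": 8, "elaboration": 14, "construction": 20, "transition": 8}

def phase_share(schedule_months):
    """Fraction of the total schedule spent in each life-cycle phase."""
    total = sum(schedule_months.values())
    return {phase: months / total for phase, months in schedule_months.items()}

# The large project spends proportionally more time before construction.
print(phase_share(SMALL)["inception"], phase_share(LARGE)["inception"])  # 0.1 0.16
```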

Example: Small-Scale Project vs. Large-Scale Project
•The success or failure of a project also reflects the relative importance of staffing and of the associated risk management.
Differences in workflow priorities between small and large projects:
Rank | Small Commercial Project | Large Complex Project
1 | Design | Management
2 | Implementation | Design
3 | Deployment | Requirements
4 | Requirements | Assessment
5 | Assessment | Environment
6 | Management | Implementation
7 | Environment | Deployment

Example: Small-Scale Project vs. Large-Scale Project
The following list elaborates some of the key differences in discriminators of success.
• Design is key in both domains. Good design of a commercial product is a key differentiator in the marketplace and is the foundation for efficient new product releases. Good design of a large, complex project is the foundation for predictable, cost-efficient construction.
• Management is paramount in large projects, where the consequences of planning errors, resource allocation errors, inconsistent stakeholder expectations, and other out-of-balance factors can have catastrophic consequences for the overall team dynamics. Management is far less important in a small team, where opportunities for miscommunications are fewer and their consequences less significant.
• Deployment plays a far greater role for a small commercial product because there is a broad user base of diverse individuals and environments.

Differences in artifacts between small and large projects

Reference
• Walker Royce, Software Project Management: A Unified Framework