A Theory of the Graceful Complexification of Concepts and Their Learnability

Fabien Mathy* and Joël Bradmetz
Université de Reims Champagne-Ardenne, France

Conceptual complexity is assessed by a multi-agent system which is tested experimentally. In this model, where each agent represents a working memory unit, concept learning is an inter-agent communication process that promotes the elaboration of common knowledge from distributed knowledge. Our hypothesis is that a concept's level of difficulty is determined by that of the multi-agent communication protocol. Three versions of the model, which differ according to how they compute entropy, are tested and compared to Feldman's model (Nature, 2000), in which logical complexity (i.e., the maximal Boolean compression of the disjunctive normal form) is the best possible measure of conceptual complexity. All three models proved superior to Feldman's: the serial version is ahead by 5.5 points of variance in explaining adult inter-concept performance.

* Correspondence concerning this article should be addressed to Fabien Mathy, Centre interdisciplinaire de recherche en linguistique et psychologie cognitive (CIRLEP), Université de Reims Champagne-Ardenne, UFR Lettres et Sciences Humaines, 57 rue Pierre Taittinger, 51096 Reims Cedex, France (e-mail: fabien.mathy@free.fr).
Computational complexity theories (Johnson, 1990; Lassaigne & Rougemont, 1996) provide a measure of complexity in terms of the computation load associated with a program's execution time. In this approach, called the structural approach, problems are grouped into classes on the basis of the machine time and space required by the algorithms used to solve them. A program is a function or a combination of functions. In view of developing psychological models, it can be likened to a concept, especially when the values of y [y = f(x)] are confined to 0 and 1. A neighboring perspective (Delahaye, 1994), aimed at describing the complexity of objects (and not at solving problems), is useful for distinguishing between the "orderless, irregular, random, chaotic" complexity (this quantity is called algorithmic complexity, algorithmic randomness, algorithmic information content, or Chaitin-Kolmogorov complexity; Chaitin, 1987; Kolmogorov, 1965) and the "organized, highly structured, information-rich" organized complexity (or Bennett's logical depth, 1986). Algorithmic complexity corresponds to the shortest program describing the object, whereas the logical depth of that program corresponds to its computation time. Working in this framework, Mathy and Bradmetz (1999) were able to account for learning and categorization in terms of computational complexity and logical depth. They devised a concept-complexity metric using a multi-agent system in which each agent represents one unit in working memory. In this model, a concept's dimensions are controlled by informationally encapsulated agents. Categorizing amounts to elaborating mutual knowledge in the multi-agent system. This model of complexity is grounded in an analysis of the complexity of the communicative process that agents must carry out to reach a state of mutual knowledge. Less concerned with psychological modelling, Feldman (2000) proposed a theory he presents as inaugural ("...the relation between Boolean complexity and human learning has never been comprehensively tested", p. 631), arguing that the complexity of a concept is best described by the compression of its disjunctive normal form (i.e., a disjunction of conjunctions of features). Feldman supported this idea with an experiment (after learning, recognize computer images of artificial amoebae of a given species that differ in size, shape, color, and number of nuclei), but no alternative models were tested.
Given the interest this model has aroused and its tempting simplicity, an experimental comparison with our model seemed indispensable.
Our model of working memory communications can be readily related to the working memory functions described in earlier research. When conceptual complexity is assessed using a multi-agent model, working memory can be jointly described by the number of units required and by the communication operations they must carry out. Working memory communications correspond to the operations controlled by the executive function; the number of units simply corresponds to the storage capacity. The processor, or executive function, has been described as a residual domain of ignorance (Baddeley, 1986, p. 225), although it has been widely studied in psychometrics for its assumed relationships with factor g (Crinella & Yu, 2000). The task-switching paradigm, for example, has been used to directly study executive control (Gilbert & Shallice, 2002). In Anderson's (1983) adaptive character of thought theory (ACT), the computational aspects of rule processing are opposed to declarative elements like facts and goals; see also Anderson's (1993) revised theory, ACT-R, and Newell's (1992) rule-production model, SOAR. However, in both models, no capacity limit is set for working memory. In Baddeley's (1976, 1986, 1990, 1992) well-known model, we find functional independence between the processor (limited capacity) and memory span (also limited), but most research based on this model has been confined to measuring memory span. Similarly, in his neo-Piagetian studies, Pascual-Leone (1970) reduced the executive component of working memory to the capacity for maintaining at least one unit in working memory. We shall see that the multi-agent models developed here incorporate both the executive component and the storage component.

The purpose of the present article is to provide an in-depth presentation of a theory of graceful complexification of concepts (i.e., continuous and non-saltatory/non-stepwise; see McClelland, Rumelhart, & PDP Research Group, 1986) briefly addressed in Mathy & Bradmetz (1999), and to compare it to Feldman's (2000) theory. Like Feldman, we represent concepts as having a component-like structure obtained by concatenating a series of features (binary for the sake of simplicity; n-ary would simply lead to a combinatory burst). We also analyze Feldman's model and point out five of its shortcomings. Then we test and compare the two models experimentally.
Setting aside philosophical problems attached to a view of concepts that rests primarily on the composition of their underlying dimensions and the possibility of expressing them in a Boolean fashion (for a thorough discussion of this point, see Fodor, 1994), we retain the possibility of (i) using a set of examples defined by the number of dimensions (e.g., in three binary dimensions, shape (round or square), color (blue or red) and size (big or small), the space of examples contains 2³, or 8, elements), (ii) using a Boolean expression to define subclasses of positive and negative examples that respectively do or do not represent the concept, and (iii) assigning each concept a function that separates the positive and negative examples (Figure 1).
Figure 1. Illustration of a three-dimensional concept.
This view was initiated by Bourne (1970); Bruner, Goodnow and Austin (1956); Levine (1966); and Shepard, Hovland and Jenkins (1961).
The positive cases are shown as black dots. Suppose the examples are geometrical objects. The upper and lower faces are small and big objects, the left and right faces are round and square objects, and the front and back faces are red and blue objects. Using the classical formalism of propositional logic, we say that X is a positive example if:

X ≡ 111 ∨ 110 ∨ 010 ∨ 000 ≡ (11∗) ∨ (0∗0)

In other words, X is a positive example if X is (Round ∧ Small) ∨ (Square ∧ Red) (i.e., in natural language: (Round and Small) or (Square and Red)).
The information content of a sample of examples is then reduced to a set of pertinent information, in order to get an economical way of describing a concept. Learning in this case can be assimilated to a compression of information. The degree of this compression defines the compressibility of the concept. For example, the learner can reduce a concept to a rule. The multi-agent model proposed here has been developed to model the compression of a concept. We assume that each dimension of a concept is identified by a single agent. When communicating agents exchange minimal information to learn a concept, we measure the compression of the minimal communication protocol they used. This minimal communication protocol can correspond to the formalism of propositional logic with its use of disjunctive and conjunctive operators. The objective of the multi-agent model is to describe several ways to obtain a compressed propositional logic formula. Because the multi-agent system is an instantiation of a working memory model, these several ways to induce a concept will be developed according to different considerations of working memory. The model is presented in detail in Appendices 1 to 4.
Parallel, Serial, and Random Versions of the Multi-Agent Model
Parallel version. Appendix 4 gives a detailed presentation of how multi-agent processing works. The cube-vertex numbering system used throughout this article is given in Figure 1B. In this presentation, it is assumed that entropy, i.e., the amount of information supplied (see Tables 7, 8, and 9), is calculated for each example presentation. This is a strong hypothesis that induces a parallel version of the model.

Table 1
Cost of each example in Figure 1, for the random version.

          Example
Order     1    2    3    4    5    6    7    8    Total
FSC       2    2    2    3    2    3    3    3    20
FCS       3    3    3    2    3    2    2    2    20
SFC       2    2    2    3    2    3    3    3    20
SCF       3    2    2    3    3    2    2    3    20
CFS       3    3    3    2    3    2    2    2    20
CSF       3    2    2    3    3    2    2    3    20
Total     16   14   14   16   16   14   14   16   120
Serial version. One can also assume that entropy is calculated once and for all for a given concept, and that each example presented will be analyzed using the same communication protocol. In this condition, the order of the agents is identical for all examples of the concept. This is the way a traditional system of rules operates, because communicating occurs in a serial manner and in a fixed order. To make the model's identification process clearer, each formula can be represented by a decision tree that gives the order of the speakers. Figure 2 presents these trees, which are read as follows: the root of the tree represents the choice to be made by the first speaker, Agent X, the branches down the next node are Y's, and the branches down the lowest node are Z's. A look at the simplest and most typical cases (1, 2, 6, 13) will make the others easier to understand.
Random version. In the third version of the model, called random, it is simply assumed that for each example presented, the agents express themselves in a random order.¹ Each concept can be assigned a number which is the mean number of agents that have to express themselves in order to identify the eight examples. As in the other versions, positive and negative examples are treated equally. In illustration, let us again consider the case in Figure 1 with the vertex numbers shown in Figure 1B. For each vertex, there are six possible speaking orders among the agents Form, Size and Color: FSC, FCS, SFC, SCF, CFS, CSF.

For each of the eight vertices and the six orders, the identification process has a cost (cf. Table 1). The mean cost of identifying the eight examples of a given concept (i.e., the number of agents who will have to speak) is 120/8 = 15, and the variance is 1. Figure 2 gives the costs of all concepts.
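As an illustration of how the cost entries of Table 1 can be derived, the sketch below recomputes them for the concept of Figure 1. It assumes the coding (11∗) ∨ (0∗0) of the positive examples, with bits ordered Form, Size, Color; since the vertex numbering of Figure 1B is not reproduced here, only the row totals and the grand total are meant to match Table 1.

from itertools import product

# Concept of Figure 1, bits ordered (Form, Size, Color): positive iff
# (round AND small) OR (square AND red), i.e. (11*) v (0*0).
POSITIVE = {(1, 1, 1), (1, 1, 0), (0, 1, 0), (0, 0, 0)}
DIM = {"F": 0, "S": 1, "C": 2}

def cost(example, order):
    """Number of agents that must speak, in the given order, before the
    revealed dimensions pin down the example's class."""
    for k in range(1, 4):
        revealed = [DIM[d] for d in order[:k]]
        compatible = [v for v in product((0, 1), repeat=3)
                      if all(v[i] == example[i] for i in revealed)]
        if len({v in POSITIVE for v in compatible}) == 1:
            return k                      # class settled after k announcements
    return 3

orders = ["FSC", "FCS", "SFC", "SCF", "CFS", "CSF"]
examples = list(product((0, 1), repeat=3))
totals = {o: sum(cost(e, o) for e in examples) for o in orders}
print(totals, sum(totals.values()))       # every order costs 20; grand total 120

Under each of the six orders, four vertices cost 2 speaking turns and four cost 3, which gives the row totals of 20 and the grand total of 120 (a mean of 15 per vertex over the six orders).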
Communication Protocols
If we identify agents by their speaking turns X, Y, Z, and we retain the operators ∧ (necessary) and [ ] (optional), every concept can be associated with a formula that corresponds to a communication protocol and expresses the total information-processing cost for its eight examples. Concepts 6, 9, and 12 in the parallel version of the model will be used here to illustrate how these formulas are determined. Each positive and negative example of Concept 6 possesses a homolog on one side, so two agents will always suffice to identify it. We therefore write X∧Y⁸, or simply X∧Y, where the absence of an exponent is equal to an exponent of 8 by default. This formula means that for the eight cases of the concept, two agents will be necessary and sufficient each time. For Concept 9, we can see that all negative examples have a homolog on one side so they can be identified by two agents, but the positive examples are isolated so they will need three agents. This is written X∧Y[Z]², meaning that two agents will always be necessary, and in two cases they will not be sufficient and a third agent will have to contribute. Looking at Concept 12, we find that four cases (all negative) are located on a side with a homolog so two agents will be necessary and sufficient, but the other four are isolated and therefore require three agents. This is denoted X∧Y[Z]⁴. The formulas for the serial and parallel versions are given in Figure 2.
Feldman’sModel(2000)
Feldman examines positive examples only, which he enumerates in disjunctive normal form (DNF) (see also Feldman, 2003, for a complete classification of Boolean concepts from one to four dimensions). Illustrating with the concept in Figure 1 again, if we distinguish f (round) from f′ (square), s (big) from s′ (small), and c (red) from c′ (blue), the positive examples (vertices 1, 2, 6, and 8) are written fs′c′, fs′c, f′s′c, and f′sc, and the concept is written

(f∧s′∧c′) ∨ (f∧s′∧c) ∨ (f′∧s′∧c) ∨ (f′∧s∧c)

Using a heuristic that is not described in depth, Feldman states that the maximal compression of this DNF would be c∧(s∧f)′ ∨ (c′∧s′∧f). Here, the notation system of propositional logic, based on De Morgan's law, is used to go from one connective to the other by means of negation (denoted by an apostrophe): (s∧f)′ ≡ (s′∨f′). Then the author counts the number of letters in the compressed formula and draws from it the complexity index for that concept. Apart from its predictive value, this model calls for five remarks.
1. It moves immediately from the DNF of a concept to its complexity, without trying to support the psychological plausibility of this transition, that is, without attempting to come closer to a plausible way of functioning for working memory, even a relatively global and abstract one.

2. For the model to be granted the universality it claims to have, certain choices made in elaborating the formulas need to be developed and supported. Why was the DNF chosen? What makes the connectives ∧ and ∨ superior to others for modelling categorization? What compression heuristics are used? Why is an arbitrary, fixed order employed for enumerating the binary features of the concept?
¹ This principle is similar in some ways to the one defended by Pascual-Leone (1970), who uses the Bose-Einstein occupancy model of combinatorial analysis to describe the activation of working-memory units.
3. There is an error in the model, related both to the opacity of the compression technique and to the arbitrary order adopted for concept features. A simple method called Karnaugh-Veitch diagrams, which can be easily applied by hand (Harris & Stocker, 1998; Vélu, 1999), indicates that Concept 7 (precisely the one used as an example) has a maximal compression of 4, not 6 as the author states, since the formula (f∧s′∧c′) ∨ (f∧s′∧c) ∨ (f′∧s′∧c) ∨ (f′∧s∧c) can be reduced to (s′∧f) ∨ (c∧f′), which describes the small round examples and the red square examples.
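The reduction claimed above can be checked by brute force over the eight examples. The sketch below assumes the primed reading of the features reconstructed earlier (f = round, s = big, c = red, a prime denoting the complementary value); it is an illustration, not the Karnaugh-Veitch procedure itself.

from itertools import product

# Feature coding assumed from the reconstruction above: f = round, s = big, c = red.
def full_dnf(f, s, c):
    # (f ^ s' ^ c') v (f ^ s' ^ c) v (f' ^ s' ^ c) v (f' ^ s ^ c)
    return (f and not s and not c) or (f and not s and c) \
        or (not f and not s and c) or (not f and s and c)

def reduced(f, s, c):
    # (s' ^ f) v (c ^ f'): the small round and the red square examples
    return (not s and f) or (c and not f)

# The two formulas classify all eight examples identically ...
assert all(bool(full_dnf(*v)) == bool(reduced(*v))
           for v in product((0, 1), repeat=3))
# ... and the reduced form contains 4 literals, not 6.
print(sum(ch.isalpha() for ch in "(s'∧f)∨(c∧f')"))   # 4

Running it confirms that the four-literal formula is logically equivalent to the full DNF, which is the basis of the correction applied to Concept 7.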
4. The author briefly mentions prototype theories in his conclusion, without indicating how his construction could be a model of them.

5. It is not clear why positive and negative examples are not treated equally and why they even constitute a complexity factor independently of Boolean complexity. This point is simply linked to the method used by the author and is not an intrinsic characteristic of concepts. In our method, as we shall see, symmetrical processing of positive and negative cases eliminates this source of variation.
The concept-complexity indexes derived from Feldman's model (corrected for Concept 7) are given in Figure 2. To facilitate comparison with the other indexes, we multiplied them by 8.
Concept Learnability
A concept is a discriminant function in a space of examples. The simplest discrimination is linear separability, the kind a perceptron can achieve without a hidden layer. Linear separability supplies a measure of complexity, but it is insufficient because it performs an undifferentiated classification that puts a whole range of cases with interactions of variable complexity under a single label, inseparable. The multi-agent model conveniently offers the possibility of assigning each concept a particular Boolean function, which, provided one is able to order those functions – we shall see later how a lattice can do just that – considerably enriches the separable versus non-separable model. In this case, every function used has a corresponding Vapnik-Chervonenkis (VC) dimension (see Boucheron, 1992) and exhibits better case discrimination.²
Between learning and knowledge activation there is identity of form. When a learning master supplies answers, the subject learns to identify the stimulus and to associate it with a subclass (positive or negative); in other words, novices do the same thing as when they know the concept and activate their own knowledge. One can thus assume logically that the time taken to identify an example and assign it to the correct subclass once the concept is learned is proportional to the concept's complexity, and also that the learning time, i.e., the time taken to memorize the subclass to which each example belongs, is proportional to the concept's complexity as well.
A hierarchy of concept complexity up to three dimensions can be proposed, based on the ordering of multi-agent formulas in a Galois lattice (Appendix 5, Figure 12). However, it is more practical to assign an overall complexity index to a concept by taking the sum of all calls to all agents that occur while the concept's eight examples are being identified. These indexes, which indicate the logical depth of the concept, are given in Figure 2 for the various versions of the model.
Now let us present the method that enables us to compare the four measures of conceptual complexity: multi-agent serial, parallel, and random complexity, and Feldman's Boolean complexity.
METHOD
Subjects
Seventy-three undergraduate and graduate students (29 men and 44 women) between the ages of 18 and 29 (mean age: 21 years 7 months) participated in the experiment.
Procedure
A computer-assisted learning program was written (available at http://fabien.mathy.free.fr/). A standard concept-learning protocol was used: examples were presented in succession to the subject, who had to sort them by putting the positive examples in a "briefcase" and the negative examples in a "trash can". Feedback was given each time. The examples were generated from three dimensions: (i) shape (square, oval, or cross), (ii) color (red, blue, purple, or green), and (iii) the type of frame around the colored shape (diamond or circle). Many shapes and colors were used so that on each new concept, the examples would look different enough to avoid interference and confusion with the preceding concepts. For each concept, the dimensions were of course used only in a binary way in the entire set of examples (e.g., square vs. oval or square vs. cross).
A set of examples was composed of a subset of positive examples (ex+); its complementary subset was composed of negative examples (ex−). This partition defined the target concept. When the image displayed on the computer depicted an ex+, the learner had to click on the briefcase drawn in a window designed for that purpose; an ex− had to be put in the trash can shown in another window. Clicking the mouse in either of these windows caused the other window to disappear so that the feedback would be clearly associated with the clicked window. Each correct answer was rewarded with a "Bravo" heard and displayed in the feedback window. When an incorrect answer was given, a message was displayed saying "Not in the briefcase" or "Not in the trash can", depending on the case. All feedback messages remained on the screen for two seconds.
² Staying within the domain of linear separability as it is computed by a perceptron without a hidden layer, the VC dimension, for 3 dimensions, is equal to 4; that is, the perceptron can separate all subsets of 4 vertices obtained by bi-partitioning the vertices of a cube, provided the 4 vertices indeed define 3 dimensions and not just the 4 corners of a square. For cube example sets with more than 4 members, linear separability still may be possible, depending on the case, but it is not necessarily so.
Figure 2. Formulas and decision trees for concepts up to three dimensions. Note. P: communication protocol formula for the parallel model. S: communication protocol formula for the serial model. PC, SC, RC, FC: complexity index for the parallel, serial, random, and Feldman (2000) models, respectively. (X's choices are shown as a solid line, Y's as a black dotted line, and Z's as a grey dotted line.)
Figure 3. Possible ways of representing Boolean dimensions as physical dimensions. Note. Representation a: numerical facilitation. Representation b: spatial and numerical facilitation. Representation c: separable dimensions are amalgamated.
A "Too late" message was displayed after eight seconds if the subject still had not clicked on the briefcase or trash can, and the next example was displayed; this message kept the game going and avoided unnecessary use of time. The time limit of eight seconds was assessed by means of various surveys which yielded a late-response rate of less than 1%. Eight seconds is obviously not an absolute response time but simply served here as a means of putting all subjects in the same situation. The concept-learning criterion was deemed to be reached when the subject correctly sorted 16 consecutive examples. Every time a correct answer was given, one of 16 small squares in the progress bar was filled in (in black). Subjects could therefore see their progress at any time. Three black squares were erased if the subject responded too late. A mistake erased all previously filled-in squares and reset the learning counter at zero.
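The bookkeeping of this success criterion can be summarized in a few lines. The sketch below is a toy reconstruction, not the original program: 16 consecutive correct sorts end the concept, a late response erases three squares of the progress bar, and an error resets it to zero; the outcome strings are illustrative.

def trials_to_criterion(outcomes, criterion=16):
    progress = 0
    for n, outcome in enumerate(outcomes, start=1):
        if outcome == "correct":
            progress += 1
        elif outcome == "late":
            progress = max(0, progress - 3)   # three black squares erased
        else:                                 # "error": counter reset to zero
            progress = 0
        if progress >= criterion:
            return n                          # number of examples needed
    return None                               # criterion never reached

print(trials_to_criterion(["correct"] * 5 + ["error"] + ["correct"] * 16))  # 22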
All three dimensions were represented in a single figure (e.g., an example could be a red square with a diamond around it) in order to avoid spatial representations (as in the case of an example represented by three present or absent forms in a row). Spatial representations would have facilitated learning (see examples in Figure 3). The dimensions were also representative of separable dimensions permitting independent dimension processing (as opposed to integral dimensions; see Garner, 1974).
The dependent variables were (i) the time taken to learn the concept (T), (ii) the total number of responses (i.e., the number of examples used) (R), and (iii) the number of errors made (E).
RESULTS
Table 2 shows that the correlations between the three dependent variables were greater than .95, p < .01.
Figure 4. Box plots of learning times for the 13 concepts.
This finding allowed us to look solely at the total learning time, which was the most highly correlated with the complexity indexes for the four models. To avoid any effects brought about by the extreme scores (which can easily be seen in Figure 4), the learning times for each subject were transformed into ranks (TR) by attributing ranks 1 to 13 to the concepts (rank 13 was given to the concept learned the fastest so the coefficients in the regression analyses would remain positive).
Now, to compare the four models and identify the sources of variation in performance, we chose a linear regression analysis. Two explanatory variables (independent by construction) were selected: the presentation rank (PR) of a concept (drawn at random during the experiment) and the complexity index of each model (SC, PC, RC, and FC, for the complexity indexes of the serial, parallel, random, and Feldman models, respectively). The presentation rank (PR) was included because learning time decreased as the rank for learning a concept increased (F(12, 72) = 9.7, p < .001), as indicated in Figure 5. The dependent variable was the total learning time rank (TR) of the concept among the 13 that each subject learned. The regression analysis was applied to the four complexity indexes. The results are presented in Figure 6. The amounts of variance explained (R²) by computing a multiple regression on the concept-presentation ranks (PR) and on each complexity index were .423, .418, .400, and .363 for indexes SC, PC, RC, and FC, respectively. All of the analyses of variance on the multiple correlations were significant (F(2, 946) = 347, 340, 315, and 270, respectively, p < .001). The results of the t-tests applied to paired correlations (i.e., to the regression line coefficients) between each of the complexity indexes and the learning time ranks (TR) are given in Table 3. The models were ordered as follows: S > P > R > F. Thus, all three multi-agent models turned out to be superior to Feldman's model (2000). The serial, parallel and random models (which were statistically indistinguishable) are significantly better than Feldman's model.
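The form of the analysis reported here is an ordinary multiple regression of the learning-time rank on the presentation rank and one complexity index. The sketch below illustrates it with synthetic placeholder data (73 subjects × 13 concepts = 949 observations), not the experimental data; only the structure of the computation is meant to match.

import numpy as np

rng = np.random.default_rng(0)
PR = rng.integers(1, 14, size=949).astype(float)   # presentation rank, 1..13
SC = rng.integers(8, 25, size=949).astype(float)   # serial complexity index (toy range)
TR = 0.3 * PR + 0.4 * SC + rng.normal(size=949)    # toy learning-time ranks

X = np.column_stack([np.ones_like(PR), PR, SC])    # intercept + two predictors
beta, *_ = np.linalg.lstsq(X, TR, rcond=None)
pred = X @ beta
r2 = 1 - ((TR - pred) ** 2).sum() / ((TR - TR.mean()) ** 2).sum()
print(beta, r2)   # on the real data the paper reports R^2 = .423 for PR + SC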
Table 2
Correlations between the complexity indexes and the scores.

        SC       PC       RC       FC       TR       T        R
PC      .958**
RC      .841**   .940**
FC      .925**   .957**   .843**
TR      .576**   .574**   .559**   .533**
T       .363**   .380**   .374**   .3**     .6**
R       .351**   .367**   .359**   .353**   .637**   .978**
E       .335**   .358**   .357**   .339**   .612**   .953**   .965**

Note. TR: total learning time rank. T: total time. R: number of responses. E: number of errors. SC: serial complexity. PC: parallel complexity. RC: random complexity. FC: Feldman's complexity (2000). **: correlation is significant at the 0.01 level (two-tailed).
Table 3
Values of Student's t between correlations.

          rTR.PC   rTR.RC   rTR.FC
rTR.SC    0.26     1.15     4.16**
rTR.PC             1.63     5.26**
rTR.RC                      1.73

Note. *: significant at p < .05. **: significant at p < .01. TR: learning-time rank. SC: serial complexity. PC: parallel complexity. RC: random complexity. FC: Feldman's complexity (2000). These t-values on non-independent samples were calculated using Steiger's (1980) formula (see Howell, 1997, p. 300).
The superiority of the serial model points out the merits of modelling information processing in working memory using models that process information in a fixed order.
Regressions on the mean learning times were also calculated. Let us compare the two models that differed the most as to their prediction of learning time (Feldman's model and the serial model). The amount of variance explained by Feldman's model in this condition was greater (R² = .815, F(1, 11) = 49, p < .001) than that explained by the serial model (R² = .808, F(1, 11) = 46, p < .001), although the difference between the regression line coefficients was nonsignificant (t(11) = −0.8, NS). In illustration, Figure 7 shows the regression line between the mean learning times and the serial complexity indexes of the concepts. This finding indicates that, despite the greater readability of the results, regression calculations on mean learning times (the basis of Feldman's calculations, 2000) do not point out which models best fit the data.³
Figure 5. Mean learning time in seconds, by presentation rank of the 13 concepts.
DISCUSSION
One of the goals of this study was to evaluate Feldman's (2000) model of conceptual complexity with respect to a series of multi-agent models developed to be analogous to the functioning of working memory. The results showed that all three multi-agent models tested are superior to Feldman's (2000) for the regression of learning time over the set of three-dimensional concepts. The difference between Feldman's model and the multi-agent models lies not only in the fact that the latter take the complexity of a concept into account in terms of positive and negative examples (unlike Feldman's model, which looks solely at positive examples), but also in their clarification of dimension processing.

The inherent advantage of multi-agent models is that they allow one to address the question of the nature of information processing (serial, parallel, or random). Our results showed that the serial model is the best model because it imposes a fixed information-processing order. One reason why it prevails over the other three certainly lies in the relatively constant patterns within noun phrases in natural languages (e.g., big red round instead of round red big). The phrase's stability seems to be rooted in stylistic considerations, which impose a certain order when features are being enumerated. This would fix their order during stimulus rehearsal.

³ Pascual-Leone (1970) obtained excellent fits between theoretical and experimental curves using this same technique. By taking interindividual variability into account, Bradmetz and Floccia (submitted) showed with a LISREL model that a large part of the variance is not explained by Pascual-Leone's model and that the data fit should therefore be rejected.
Figure 6. Linear regression for the four complexity models. Note. PR: presentation rank. TR: learning-time rank. SC: serial complexity. PC: parallel complexity. RC: random complexity. FC: Feldman's complexity (2000).
Figure 7. Relationship between mean learning time and serial concept-complexity indexes.
Besides, in the three multi-agent models developed here, the communications corresponding to the executive part of working memory processing are represented by decision trees. Because it imposes an unchangeable order for nodes at different tree depths, the serial multi-agent system can be reduced to a production system like that found in the symbolic approach to reasoning (Newell, 1983, 1990, 1992; Newell & Simon, 1972; Anderson, 1983; Holland, Holyoak, Nisbett, & Thagard, 1986). In the latter approach, reasoning is based on rules whose complexity depends on the nested hierarchization of a set of variables (an idea also found in developmental psychology in Zelazo, Frye, & Rapus, 1996). Our parallel and random multi-agent models, however, go beyond the traditional representation in decision-tree format, which imposes a predefined order on the dimension hierarchy. As for the random model, it has even fewer order constraints than the parallel model: it does not need to compute entropy since agents randomly put in their information until the examples are identified. In this case, the amount of information needed only depends a posteriori on a concept's entropy (since the savings in terms of fewer speaking turns still depends on the concept's structure) and not on an a priori calculation of the concept's entropy. Feldman's model makes a clear distinction between the number of pieces of information (connected by conjunctions) needed to classify examples in disjunctive format (1, 2, or 3 pieces).
However, in addition to information quantity, multi-agent models distinguish several operating modes by introducing the idea of information ordering. To draw an analogy, multi-agent models would not only account for the number of digits (or chunks) to memorize in a telephone number, but also for problems of order and simultaneity in chunk addressing or retrieval. This kind of information is critical because (depending on the chosen parameters) it may lower the amount of information that has to be supplied to classify an example. The amount of information required per example is therefore not an absolute measure derived from disjunctive normal forms, but a relative measure that depends upon the type of processing carried out on the information.
Another point concerns the originality of representing working memory processing in terms of communication protocol formulas. The expressive power of communication formulas for describing Boolean functions is no greater than that of any other type of formalization, but their compressibility is greater. By reducing the decision-tree structure to binary communication operators, formulas offer a compressed representation of working memory operations. The agents who give the greatest amount of information are chosen first. One advantage of this approach is that it brings out the correspondence between the entropy measure and a classification of simple communication situations (choice, simple interactions, complete interactions). A communication formula represents the necessary dimensions only once in its written form. This formal simplicity stems from the fact that each communication protocol corresponds to a speaking turn calculated by a measure of information gain (based on an entropy calculation). A formalization like this (in terms of communication formulas) is simpler to read than the disjunctive normal forms proposed by Feldman. For instance, our multi-agent system models Concept 13 as X∧Y∧Z, whereas Feldman's formula is x(yz ∨ y′z′) ∨ x′(yz′ ∨ y′z). We also find a correspondence between the number n of ∧ operators and the interaction structures of order n that exist in situations with n+1 dimensions (e.g., the formula X∧Y∧Z describing a second-order interaction).
This study still has some limitations when it comes to evaluating the parallel, serial, and random models. It is difficult to validate one of the three models with the inter-concept comparison method developed here. Mathy (2002) proposed conducting a study grounded on an intra-concept comparison method that would be better able to distinguish the models. Indeed, the way multi-agent models function is such that there is a different number of agents for each example of a concept. By reading the communication protocol formulas (or tables for the random model), it suffices to establish, for each example of a concept, the number of agents needed to classify it. An example that requires more agents in order to be categorized (i.e., one representing a longer path in the decision tree) will correspond to higher response times in an application phase of an already-learned concept. One would no longer measure the tree-construction time but the time needed to navigate in an induced tree.
RÉSUMÉ

This article proposes a model and an experimental evaluation of the complexity of concepts by means of a multi-agent system in which each agent represents a unit in working memory. The model conceives of concept learning as an inter-agent communication activity that makes it possible to go from explicit distributed knowledge to common knowledge. The hypothesis is that the degree of difficulty of a conceptualization task is determined by that of the inter-agent communication protocol. Three versions of the model, differing in how the system's entropy is computed, are tested and compared to the model that Feldman (Nature, 2000) presents as definitive, which reduces the complexity of a concept to the maximal compression of its Boolean disjunctive normal form. The three versions of the model prove superior to Feldman's model: the serial version gains 5.5 points of variance in explaining the inter-concept performance of adult subjects.
REFERENCES
Anderson, J. R. (1983). The architecture of cognition. Cambridge, MA: Harvard University Press.
Apostel, L. (1963). Structure et genèse. In L. Apostel, J.-B. Grize, S. Papert & J. Piaget, La filiation des structures. Paris: Presses Universitaires de France.
Axelrod, R. (1997). The complexity of cooperation: Agent-based models of competition and collaboration. Princeton, NJ: Princeton University Press.
Baddeley, A. (1976). The psychology of memory. New York, NY: Basic Books.
Baddeley, A. (1986). Working memory. Oxford: Clarendon Press.
Baddeley, A. (1990). Human memory: Theory and practice. Boston: Allyn and Bacon.
Baddeley, A. (1992). Working memory. Science, 255, 556-559. (French translation: La mémoire humaine. Grenoble: Presses Universitaires de Grenoble, 1993.)
Bennett, C. H. (1986). On the nature and origin of complexity in discrete, homogeneous, locally-interacting systems. Foundations of Physics, 16(6), 585-592.
Boucheron, S. (1992). Théorie de l'apprentissage. Paris: Hermès.
Bourne, L. E. Jr. (1970). Knowing and using concepts. Psychological Review, 77, 546-556.
Bradmetz, J. & Floccia, C. (submitted). Comparison of four information processing models in Pascual-Leone's CSVI task.
Brazier, F., Dunin-Keplicz, B., Jennings, N. R. & Treur, J. (1995). Formal specification of multi-agent systems: a real-world case. Proceedings of the First International Conference on Multi-Agent Systems (pp. 25-32). San Francisco, CA: MIT Press.
Bruner, J. S., Goodnow, J. J., & Austin, G. A. (1956). A study of thinking. New York, NY: Wiley.
Bryant, R. E. (1986). Graph-based algorithms for Boolean function manipulation. IEEE Transactions on Computers, C-35(8).
Burmeister, B. & Sundermeyer, K. (1990). COSY: towards a methodology of Multi-Agent Systems. International Working Conference on Cooperating Knowledge-Based Systems, Keele, UK.
Chaitin, G. J. (1987). Information, randomness and incompleteness: papers on algorithmic information theory. New York, NY: World Scientific.
Cowan, N. (2001). The magical number 4 in short-term memory: A reconsideration of mental storage capacity. Behavioral and Brain Sciences, 24(1), 87-185.
Crabtree, B., & Jennings, N. R. (Eds.). (1996). Proceedings of the 1st International Conference on Practical Applications of Intelligent Agents and Multi-Agent Technology.
Crinella, F. M., & Yu, J. (2000). Brain mechanisms and intelligence. Psychometric g and executive function. Intelligence, 27(4), 299-327.
Davey, B. A. & Priestley, H. A. (1990). Introduction to lattices and order. Cambridge: Cambridge University Press.
Delahaye, J. P. (1994). Information, complexité et hasard. Paris: Hermès.
Epstein, J. M., & Axtell, R. (1996). Growing artificial societies. Cambridge, MA: MIT Press.
Fagin, R., Halpern, J. Y., Moses, Y., & Vardi, M. Y. (1995). Reasoning about knowledge. Cambridge, MA: MIT Press.
Feldman, J. (2000). Minimization of Boolean complexity in human concept learning. Nature, 407, 630-633.
Feldman, J. (2003). A catalog of Boolean concepts. Journal of Mathematical Psychology, 47, 75-89.
Ferber, J., & Gutknecht, O. (1998). A meta-model for the analysis and design of organizations in Multi-Agent Systems. Third International Conference on Multi-Agent Systems, ICMAS 98 (pp. 128-135), Paris, France.
Ferber, J. (1999). Multi-Agent Systems: an introduction to distributed artificial intelligence. Harlow: Addison-Wesley.
Fischer, K. W. (1980). A theory of cognitive development: the control and construction of hierarchies of skills. Psychological Review, 87, 477-531.
Fodor, J. (1994). Concepts: a potboiler. Cognition, 50, 95-113.
Ganter, B. & Wille, R. (1991). Applied lattice theory: formal concept analysis. In G. Grätzer (Ed.), General lattice theory. Basel: Birkhäuser.
Garner, W. R. (1974). The processing of information and structure. Potomac, MD: Lawrence Erlbaum.
Gell-Mann, M. (1994). The quark and the jaguar: Adventures in the simple and the complex. New York, NY: Freeman and Company.
Gilbert, N., & Conte, R. (1995). Artificial Societies. London: UCL Press.
Gilbert, S. J. & Shallice, T. (2002). Task switching: a PDP model. Cognitive Psychology, 44, 297-337.
Halford, G. S., Wilson, W. H., & Phillips, S. (1998). Processing capacity defined by relational complexity: implications for comparative, developmental, and cognitive psychology. Behavioral and Brain Sciences, 21, 803-831.
Harris, J. W. & Stocker, H. (1998). Handbook of mathematics and computational science. New York, NY: Springer-Verlag.
Holland, J. H., Holyoak, K. J., Nisbett, R. E. & Thagard, P. R. (1986). Induction: processes of inference, learning and discovery. Cambridge, MA: MIT Press.
Howell, D. C. (1997). Statistical methods for psychology (4th edition). New York: Duxbury. (French translation by M. Rogier: Méthodes statistiques en sciences humaines. Bruxelles: De Boeck, 1998.)
Johnson, D. (1990). A catalog of complexity classes. In J. van Leeuwen (Ed.), Handbook of theoretical computer science (pp. 67-161). Amsterdam: Elsevier Science Publishers.
Kendall, E. A., Malkoum, M. T., & Jiang, C. H. (1995). A methodology for developing agent-based systems. In C. Zhang & D. Lukose (Eds.), First Australian Workshop on Distributed Artificial Intelligence.
Kolmogorov, A. N. (1965). Three approaches for defining the concept of information quantity. Information Transmission, 1, 3-11.
Lassaigne, R., & de Rougemont, M. (1996). Logique et complexité. Paris: Hermès.
Levine, M. (1966). Hypothesis behavior by humans during discrimination learning. Journal of Experimental Psychology, 71, 331-338.
Mathy, F. & Bradmetz, J. (1999). Un modèle multi-agent de la complexité conceptuelle. In M. P. Gleizes & P. Marcenac (Eds.), Ingénierie des systèmes multi-agents: actes des 7e Journées Francophones d'Intelligence Artificielle et Systèmes Multi-Agents (pp. 343-345). Paris: Hermès.
Mathy, F. (2000). L'apprenabilité des concepts évaluée au moyen d'un modèle multi-agent de la complexité des communications en mémoire de travail. Unpublished doctoral dissertation, Université de Reims, France.
McClelland, J. L., Rumelhart, D. E., & the PDP Research Group. (1986). Parallel distributed processing (2 vols.). Cambridge, MA: MIT Press.
Minsky, M. L. (1985). The society of mind. New York, NY: Simon and Schuster. (French translation: La société de l'esprit. Paris: InterEditions, 1988.)
Müller, J. P. (1996). The design of intelligent agents. Lecture Notes in Artificial Intelligence, Vol. 1177. New York, NY: Springer-Verlag.
Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice Hall.
Newell, A. (1983). Production systems: Models of control structures. In W. G. Chase (Ed.), Visual information processing. New York, NY: Academic Press.
Newell, A. (1990). Unified theories of cognition. Cambridge, MA: Harvard University Press.
Newell, A. (1992). SOAR as a unified theory of cognition: issues and explanations. Behavioral and Brain Sciences, 15, 464-492.
Nozaki, A. & Anno, M. (1985). Anno's hat tricks. New York: Philomel Books. (French translation: Le jeu des chapeaux. Paris: Flammarion, 1991.)
Pascual-Leone, J. (1970). A mathematical model for the transition rule in Piaget's developmental stages. Acta Psychologica, 32, 301-345.
Quinlan, J. R. (1986). Induction of decision trees. Machine Learning, 1, 81-106.
Ruth, M., & Ryan, M. (2000). Logic in computer science. Cambridge, UK: Cambridge University Press.
Selfridge, O. (1959). Pandemonium: a paradigm for learning. In Symposium on the Mechanization of Thought Processes. London: HM Stationery Office.
Shepard, R. N., Hovland, C. L., & Jenkins, H. M. (1961). Learning and memorization of classifications. Psychological Monographs, 75(13), 1-42.
Steiger, J. H. (1980). Tests for comparing elements of a correlation matrix. Psychological Bulletin, 87, 245-251.
Vélu, J. (1999). Méthodes mathématiques pour l'informatique. Paris: Dunod.
Zelazo, P. D., Frye, D. & Rapus, T. (1996). An age-related dissociation between knowing rules and using them. Cognitive Development, 11, 37-63.
Received June 11, 2003. Accepted November 24, 2003.
APPENDICES
Appendix 1. Multi-Agent Models
Multi-agent models are collective problem-solving methods. Although this idea is relatively old in psychology (Minsky, 1985; Selfridge, 1959), it was not until recently that simulations of agent societies and their evolution in computer science were developed (Brazier, Dunin-Keplicz, Jennings, & Treur, 1995; Burmeister & Sundermeyer, 1990; Crabtree & Jennings, 1996; Epstein & Axtell, 1996; Ferber, 1999; Gilbert & Conte, 1995). Despite a number of attempts to devise general models (Ferber & Gutknecht, 1998; Kendall, Malkoum, & Jiang, 1995; Müller, 1996), there has been no architectural standardization. Multi-agent systems draw their inspiration from Minsky (1985), who developed the idea of a society of mind, according to which a mind can be constructed from numerous small parts, each of which is mindless. According to this principle, in a multi-agent system, competence is not centralized but distributed among different agents who communicate with each other. The key notions are usually collaboration, competition, communication, and self-organization. These models have been applied, for instance, to modelling economic and social problems (Axelrod, 1997), and articles devoted to them are abundant today in journals of computer science theory.
In the model proposed here, we assume that each dimension of a concept is identified by a single agent, and that information must be exchanged until the example presented can be identified as a positive or negative example.
The general problem is one of going from distributed knowledge to common knowledge (Fagin, Halpern, Moses, & Vardi, 1995). Based on the choice of a few basic properties related to how working memory functions, we shall present three versions of the model: parallel, serial, and random. Let us start from the following assumptions:
1. Each agent has information about a binary dimension (so there are as many agents as there are dimensions) and knows the composition of the positive and negative subsets of examples. If, for instance, a small red circle is presented, the size agent knows small (not big), the color agent knows red (not blue), and the shape agent knows circle (not square). But each agent is unaware of what the others know, which means that full knowledge of the concept is distributed among them.
2. Agents take turns making the information they have public. The process ends when the publicly shared information suffices for someone who knows the composition of the positive and negative subsets (i.e., the concept) – and this is the case for all agents – to assign the concerned exemplar to the right subclass.
3. As common knowledge is being built, speaking turns are assigned on the basis of an entropy calculation, which enables agents to compare the amounts of information they are capable of contributing (see Quinlan, 1986, for a method of calculating entropy suited to the construction of trees that minimize the information flow). If a fixed rank for communicating information is set for each agent, identifying the example would amount to finding the path in a decision tree called an OBDD, or ordered binary decision diagram, if the dimensions are Boolean (see Bryant, 1986; Ruth & Ryan, 2000). This option will be chosen below when we develop the serial multi-agent model. The originality of the parallel multi-agent model (which will be described to the greatest extent in this study) lies in the fact that speaking turns taken by agents to release their partial knowledge are not controlled a priori but are calculated at each occurrence. When an agent releases a piece of information, it knows how much its contribution reduces the uncertainty, since it knows the target subclasses of positive and negative examples of the concept and it also knows what subset (and the positive and negative examples it contains) it is leaving for the agents that follow it. Take a three-dimensional space whose dimensions are shape, color, and size (hereafter labeled F for form, C for color, and S for size). If the subclass of positive examples includes all small round examples, small red examples, and red square examples (see Figure 1), and a small round blue example is presented, the shape agent, by stating "round", cuts the uncertainty in half (two pieces of information suffice to identify the example and it gave one; the other will be size). But the color agent will only reduce the uncertainty by a third, since the other two will have to express themselves after he does. In this example, the shape agent or the size agent can thus declare a greater reduction in uncertainty than the third agent can, and one of them (say, drawn at random) will speak first. Turn-taking is thus determined by the information furnished and not by the type of dimension evoked. Thus, for a given concept, different examples can give rise to different speaking turns (a short sketch of this turn-taking computation is given at the end of this appendix).
If, for the concept described above, the small red square example is presented, the shape agent will still speak first, but this time it is the color agent and not the size agent who speaks in second place. The identification of examples in the case of pre-determined speaking turns would be written as follows:

F ∧ (S ∨ C)

Now writing this in terms of speaking turns with maximal informativeness, we get

X ∧ Y

where X, Y, etc. are the agents (of any kind) who speak first, second, etc. In other words, no matter what example is presented, two pieces of information will suffice to assign it to the correct subclass. The speaking order of the agents thus depends upon the amount of information contributed: the most informative speaks first (or, in case of a tie, one of the most informative), and so on. To make this process more concrete, imagine a card game where, before a round, each agent states how much he will reduce the uncertainty by making a bid or laying down a card. We call this model parallel, not because the agents talk at the same time but because they simultaneously and independently calculate how much they can each reduce the entropy.
4. During knowledge building, silences are not interpreted in example assignment. Suppose the positive subclass is big square figures, big blue figures, and blue square figures. The example presented is a big red round figure. By announcing "big", the size agent makes it impossible for the other two agents to individually give a decisive piece of information. If they mutually interpreted their silence (as in the "Game of Hats"; see Nozaki & Anno, 1991), they would arrive at the conclusion that it is a red round figure. Another important feature of the model is the mandatory dissociation between the information an agent thinks it will contribute and the information it actually does contribute. Sometimes the two coincide and sometimes uncertainty remains (as in the above example). Before systematically developing this point, let us take another example from the concept in Figure 1. If a small red round example is presented, the shape agent knows that one speaking turn will suffice after his own. If, for that example, the color agent had spoken first, then it would be uncertain about its real contribution: when it sees red, it can say to itself (remember that each agent knows the target subclasses) that if the example is small or square, a single speaking turn will suffice, but if it is big and round, then two speaking turns will be necessary. It cannot in fact know exactly how many agents will have to speak about a particular example after it does; it can only know this in terms of an expectancy over a large number of speaking turns.
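The following sketch illustrates the turn-taking computation described in point 3, for the concept and the small round blue example used there. It takes a simplified, greedy view: each agent evaluates the fewest further announcements needed after its own, rather than the full expectancy over simple interactions discussed in point 4; the bit coding (Form, Size, Color, with 1 = round, small, red) is an assumption.

from itertools import product

# Concept of point 3: the positives are the small round, the small red,
# and the red square examples.
def positive(f, s, c):
    return (s and f) or (s and c) or (c and not f)

ALL = list(product((0, 1), repeat=3))

def speakers_needed(example, revealed):
    """Fewest further announcements that settle the example's class, given the
    dimensions already made public (a simplified, greedy view of the model)."""
    compatible = [v for v in ALL if all(v[d] == example[d] for d in revealed)]
    if len({bool(positive(*v)) for v in compatible}) == 1:
        return 0
    return 1 + min(speakers_needed(example, revealed | {d})
                   for d in range(3) if d not in revealed)

# The small round blue example of the text: speaking first leads to 2 turns in
# all for the Form or Size agent, but to 3 turns for the Color agent.
example = (1, 1, 0)                    # round, small, not red (blue)
for d, name in enumerate("FSC"):
    print(name, 1 + speakers_needed(example, {d}))

With this concept, the Form and Size agents each announce a total of two turns while the Color agent announces three, so Color never speaks first, as stated in the text.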
Appendix 2. Removing Uncertainty
With one dimension, the uncertainty is completely and immediately removed because the agent in control of the dimension possesses 100% of the information. With two dimensions, there are several possible cases whose distribution is indicated in Figure 8.
Figure 8. Uncertainty cases with 2D concepts. Note. I: identification achieved (no speakers necessary). C: choice (one speaker suffices). S and S̄: simple interaction (one or two speakers necessary, depending on the case). D: dual interaction (two speakers always necessary).
In case C, a single agent, hereafter denoted X, suffices to remove the indetermination. (Remember that labels are assigned to speaking turns, not particular agents: X is not the color agent or the shape agent but the agent who speaks first, because it is the one (or among the ones) who provides the most information.) In case D, whether a positive or negative example is at stake, two agents must speak up: X and Y. In case S, identifying one of the three positive examples requires only one agent because the three examples are in a disjunctive relation (e.g., square or red). On the other hand, identifying the negative example requires the participation of two agents (as in the above blue round example). In case S, we therefore sometimes have X (3/4 of the time) and sometimes X∧Y (1/4 of the time). This is a particular disjunction since it is X∨Y with X always present. Using the notation of propositional logic, this case is written X[Y] (affirmative connective of X, for all Y), whose truth table is:

X    X[Y]    Y
1    1       1
1    1       0
0    0       1
0    0       0
Case S̄ has exactly the same structure as case S, except for the fact that the proportions of positive and negative examples are reversed. Case I is trivial: no information is necessary. Case D is a complete interaction because the decision cannot be made, regardless of what one agent responds, unless he knows the other agent's response. Cases S and S̄ are partial interactions between the two dimensions. Thus, when just two dimensions are considered, there are only three ultimate forms of inter-agent communication. They are written:

X;   X[Y];   X∧Y
Whenever additional dimensions are considered, they will be either necessary or optional. One can thus deduce that in a space with n binary dimensions, an inter-agent communication process that progresses from implicit distributed knowledge to enough common knowledge to assign an example to the correct subclass is an expression that only supports the operators ∧ (necessary) and [ ] (optional).
The sixteen binary operators of propositional logic can be related (with one rotation) to two-dimensional concepts, as indicated in Figure 9.
Piaget’sINRCgroup,ortheKleinfour-groupisagroupof4elementsthatcombinestwocyclical,2-elementgroups.
Figure 9. Concepts and propositions.
Table 4
Conceptual forms up to four dimensions.

D1: X
D2: X[Y];  X∧Y
D3: X[Y[Z]];  X[Y∧Z];  X∧Y[Z];  X∧Y∧Z
D4: X[Y[Z[W]]];  X[Y[Z∧W]];  X[Y∧Z[W]];  X[Y∧Z∧W];  X∧Y[Z[W]];  X∧Y[Z∧W];  X∧Y∧Z[W];  X∧Y∧Z∧W
If we add a third 2-element group, we obtain an 8-element group, called the extended INRC group (Apostel, 1963). The extended group operates within the sixteen operators of propositional logic, with the same categories as in Figure 9. One can understand this for geometrical reasons (Klein four-group operators are simple or combined rotations in this case).
The relevance of the notation in terms of necessary and optional dimensions can easily be verified. For instance, for all junctions and all implications, it may suffice to have a single piece of information (e.g., p false for p∧q; p true for p∨q; q true for p↓q; q false for p←q, etc.), but it is necessary to have two pieces in half of the cases (this can easily be seen by inverting the truth values in the above examples in parentheses).
Starting from these three basic expressions in two dimensions, each formula is obtained by adding a new agent, connected by one of the two operators. This is a recursive concept-building process that justifies the name "graceful" complexification. Table 4 gives the different forms up to four dimensions.
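To make the recursion concrete, here is a small generative sketch: starting from a single agent, each additional agent is chained to the previous ones either as necessary (∧) or as optional ([ ]), which yields the 1, 2, 4, and 8 forms listed in Table 4. The function name and agent labels are illustrative.

from itertools import product

def forms(agents):
    """All conceptual forms obtained by chaining agents with the two
    connectives of Table 4: ∧ (necessary) and [ ] (optional)."""
    first, rest = agents[0], agents[1:]
    results = []
    for connectives in product("∧[", repeat=len(rest)):
        s, opened = first, 0
        for c, a in zip(connectives, rest):
            if c == "∧":
                s += "∧" + a
            else:
                s += "[" + a
                opened += 1
        results.append(s + "]" * opened)
    return results

for n in range(1, 5):
    print(n, forms("XYZW"[:n]))   # 1, 2, 4, and 8 forms, matching Table 4

Each extra dimension doubles the number of chained forms in this enumeration.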
The formula of a concept condenses the set of operations applicable to each positive and negative example of the concept. It is easy to find them and list them in disjunctive form. For instance, for X[Y∧Z[W]], we get:

X ∨ (X∧Y∧Z) ∨ (X∧Y∧Z∧W)
Figure 10. Rotations and enantiomorphisms.
In other words, each example requires the contribution of either a single agent, or of three or four agents. Another example is:

X[Y[Z[W]]] ≡ X ∨ (X∧Y) ∨ (X∧Y∧Z) ∨ (X∧Y∧Z∧W)
Appendix 3. Counting and Characterizing Concepts
Before coming back to how our multi-agent system works, let us make a few remarks about concepts with up to three and sometimes four dimensions. Note that the fourth dimension is generally the upper limit of human working memory, not because the boundary between 3 and 4 or between 4 and 5 is decisive, but because one must consider the entire set of relations between the elements in working memory. So it is not a load E of elements that has to be considered but a load of P(E), that is, all subsets of n elements, since every subset also constitutes an example of the concept. Three elements generate a load of 8, 4 a load of 16, 5 a load of 32, etc. This approach has now been adopted by authors desirous of reconciling Piaget's theory and information-processing theory (the so-called neo-Piagetian trend) and who, following Pascual-Leone's (1970) seminal work, developed various models revolving around the new "magical number" four (Case, 1985; Cowan, 2001, though not neo-Piagetian; Fischer, 1980; Halford, Wilson, & Phillips, 1998). We find this natural limitation in many human activities, such as the four main phrases of a sentence, the four singing voices, the four suits in a deck of cards.
Every concept possesses numerous realizations in its space that are equivalent save one transformation (rotations and enantiomorphisms, Figure 10) and one substitution of the subclasses of positive and negative examples (e.g., the set of small round and red square examples is equivalent to the set of big round and blue square examples insofar as the same subclasses are opposed). For instance:
Thus, many concepts are equivalent. Table 5 gives a count of concepts up to four dimensions.
Figure 11 gives a count of the equivalent forms for three dimensions. Each concept is labelled with an identification number (in boldface), which will be used in the remainder of this article.
Table 6 gives a count of the different forms in three dimensions. It does not bring out any notable regularities.
Appendix 4. Multi-Agent System and Terminating the Identification Process
To describe the functioning of the multi-agent system in greater detail, we devised a table that explains the speaking turns, and we present some illustrations for the parallel version.
Table 5
Number of different concepts up to four dimensions.

              Number of positive examples
Dimensions    1    2    3    4    5    6    7
1             1
2             1    2    1
3             1    3    3    6    3    3    1
4             1    4    6    19   27   50   56

Figure 11. Equivalent forms in three dimensions.
Table 7 shows Concept 12. The rows contain the examples of the vertices of a cube, numbered as in Figure 1B, which will be used here for all cases presented (the positive examples of Concept 12 are 1, 5, and 6).
The first three columns give the situation the agent in that column leaves for the other two agents if he speaks about the example in that row. For three dimensions, the first speaker leaves a square which can only have four forms, as we saw above: identification achieved (0 speakers), denoted I (though it is not used in the present tables); dual interaction (2 speakers), denoted D; simple interaction (1 or 2 speakers), denoted S; and choice (1 speaker), denoted C. The ambiguous case is the simple interaction, where the agent who talks does not know exactly what reduction of uncertainty he brings (but only knows it in a probabilistic manner).
Table 6
The 13 concepts in three dimensions.

Formula      Number of concepts (by number of positive examples, 1 to 4)
X            ∗
X[Y]         ∗
X∧Y          ∗∗
X[Y[Z]]      ∗
X∧Y[Z]       ∗∗∗∗∗
X[Y∧Z]       ∗∗
X∧Y∧Z        ∗
For each example in the table, after the simple interaction (S), the real situation is shown in brackets (D or C). It is normal that an agent who leaves a simple interaction expresses itself before an agent leaving a dual one, although his actual uncertainty reduction will not be greater in one out of four cases. Table 7 allows us to write the identification formula: X∧Y[Z]⁴. This formula means that the contribution of two agents is always necessary, and that it is sufficient only for four of the eight examples. Four times then (index of Z), the third agent will have to speak. The same principles apply to Concept 11 (Table 8).
Let us look in detail at a final case, Concept 10 (Table 9). The situation is very revealing about the uncertainty effects generated by a simple interaction. In all cases, it would be possible to identify the example with only two agents. Yet it is impossible for the system to know this at the onset, and all agents are in the same situation, that of leaving a simple interaction, with 2/3 leading to a choice and 1/3 leading to a dual interaction on examples 1, 3, 4, 5, 6, and 8. All in all, since 6 × 1/3 = 2, the formula is X∧Y[Z]². Although this case is different from the preceding ones, themselves different from each other, their formulas are identical (disregarding the index in certain cases). We will say that they are isotopes for the purposes of our classification.
Concept 10 is exemplary of the ambiguities that arise when we go from a state of distributed knowledge to a state of common knowledge in a community of agents. To better illustrate this ambiguity, let us borrow an example from Fagin, Halpern, Moses, and Vardi (1995), who gave the example of three wives likely to be cheated on by their husbands, but each one having knowledge only of the misfortune of the others, should the case arise.
Table 7
Communication table for the examples of Concept 12 (ex+ = 1, 5, 6).

     Size (S)   Color (C)   Shape (F)   Speaks 1st   Speaks 2nd   Speaks 3rd
1    D          S[D]        D           C            S∨F          F∨S
2    D          D           D           S∨C∨F        S∨C∨F        S∨C∨F
3    S[C]       S[C]        D           S∨C          S∨C          ∗∗∗∗
4    D          S[C]        S[C]        C∨F          C∨F          ∗∗∗∗
5    S[D]       D           D           S            C∨F          F∨C
6    D          D           S[D]        F            S∨C          C∨S
7    S[C]       S[C]        S[C]        S∨C∨F        S∨C∨F        ∗∗∗∗
8    S[C]       D           S[C]        S∨F          F∨S          ∗∗∗∗

Note. This table gives the situation after the first speaking turn. A similar table is needed for the second turn; the reader can easily guess what it would be.
Table 8
Communication table for Concept 11 (ex+ = 1, 2, 6, 7).

     Size (S)   Color (C)   Shape (F)   Speaks 1st   Speaks 2nd   Speaks 3rd
1    S[C]       S[D]        C           F            S            ∗∗∗∗
2    S[C]       C           C           C∨F          S            ∗∗∗∗
3    S[C]       D           C           F            S            ∗∗∗∗
4    S[D]       S[D]        S[D]        S∨C∨F        S∨C∨F        S∨C∨F
5    S[C]       C           C           C∨F          F∨C          ∗∗∗∗
6    S[C]       C           D           C            S            ∗∗∗∗
7    S[D]       D           D           S            C∨F          F∨C
8    S[C]       C           D           C            S            ∗∗∗∗

Note. This table gives the situation after the first speaking turn. A similar table is needed for the second turn; the reader can easily guess what it would be.
The state of the world is A+B−C+ (i.e., A and C have unfaithful husbands). In this system, we can describe the following two levels for the knowledge Kn = "There is at least one wife whose husband is unfaithful":

1. Mutual knowledge: ∀ i, j, k ∈ E = (A, B, C), i knows Kn, and knows that j knows Kn and that j knows that k knows Kn.

2. Shared knowledge: ∀ i ∈ E = (A, B, C), i knows Kn. Each agent knows that there is at least one + but doesn't know that the others know.

The present example stabilizes in this state if no information is communicated. Individually, by way of her perception of the situation, each of the three wives considers the proposition "There is at least one +" to be true. But she cannot infer anything about what the others know. A could very well think that B only sees one + (if A thinks she is − herself), that C sees no +'s and that she believes there are none (if A thinks she is − herself and assumes that C does likewise). We end up with a considerable distortion between the state of the world (A+B−C+) and the attribution of A's belief to C (A−B−C−).
Appendix 5. Galois Lattices

A complexity hierarchy for concepts up to three dimensions can be proposed based on the ordering of multi-agent formulas in a Galois lattice (Figure 12).

A lattice is an ordered set in which any two elements always have an upper bound and a lower bound (Davey & Priestley, 1990). Let a and b be two elements; the upper bound (a∪b) is the supremum of these two elements. Reasoning in the same way for the lower bound, the infimum of a and b is (a∩b). More specifically, in the framework of formal conceptual analysis, two lattices are merged to form pairs and this gives a Galois lattice (Ganter & Wille, 1991). Each multi-agent formula is seen as a pair (A, B) such that A is the set of concepts learnable from a communication formula (definition in extension) and B is the set of constraints imposed upon a formula (definition in intension). In the lattice, each formula assigned to a location can learn all subordinate concepts. For simplicity's sake, across from each formula, we give only the most complex concept learnable from it. The processing cost is incurred when equivalent communication structures (isotopes) process different concepts (e.g., Concepts 8, 9, 10, 11, and 12 for the structure X∧Y[Z]ⁿ). In this case, the structure occurs several times with different cost indexes. The criterion that orders the structures is:

(A1, B1) < (A2, B2) ≡ A1 ⊆ A2 ≡ B2 ⊆ B1

Thus, a complex communication protocol enables learning of more concepts than a protocol that is subordinate to it. The more constraints one imposes on a communication protocol (fewer agents, less communication), the smaller the number of concepts the protocol can learn.
Table 9
Communication table for Concept 10 (ex+ = 1, 2, 5, 6).

     Size (S)   Color (C)   Shape (F)   Speaks 1st   Speaks 2nd   Speaks 3rd
1    S[C]       S[D]        S[C]        S∨C∨F        S∨C∨F        (S∨F)
2    S[C]       S[C]        S[C]        S∨C∨F        S∨C∨F        ∗∗∗∗
3    S[C]       S[C]        S[D]        S∨C∨F        S∨C∨F        (S∨C)
4    S[D]       S[C]        S[C]        S∨C∨F        S∨C∨F        (C∨F)
5    S[D]       S[C]        S[C]        S∨C∨F        S∨C∨F        (C∨F)
6    S[C]       S[C]        S[D]        S∨C∨F        S∨C∨F        (S∨C)
7    S[C]       S[C]        S[C]        S∨C∨F        S∨C∨F        ∗∗∗∗
8    S[C]       S[D]        S[C]        S∨C∨F        S∨C∨F        (S∨F)

Note. This table gives the situation after the first speaking turn. A similar table is needed for the second turn; the reader can easily guess what it would be.
Inversely, the fewer the constraints, the more a formula is able to learn numerous, complex concepts (even if the available agents or communications become useless for simpler concepts). Starting from the top of the lattice, we find a concept (A, B) such that B does not contain the constraint present in the concept below it (e.g., compared to the formula X∧Y∧Z, we impose only 4 calls on Z for the concept X∧Y[Z]⁴). This process is repeated in a top-down manner until all constraints have been used (we then obtain a unary agential protocol: X). The concepts are treated in the same way via a bottom-up process: moving up the lattice, we gradually add the learnable concepts (this means that the formula X∧Y∧Z permits learning of all concepts in three dimensions). This lattice provides a theoretical concept-complexity order. An analogy can be drawn with the complexities described in the introduction and used in computer science theory. The total number of agents in a formula corresponds to the maximal compression of the group of speakers. This number is like random Chaitin-Kolmogorov complexity; it is the cardinal of the set of all necessary independent agents. Then, during the identification of all examples of the concept, the agents will be called a certain number of times. This reuse of agents can be likened to Bennett's logical depth.⁴
⁴ Remember that Chaitin-Kolmogorov complexity corresponds to the size of the smallest program capable of describing (computing) an object. This size can be measured in terms of information quantity. Now, Bennett's logical depth corresponds to the computation time of this smallest program. In other words, it is proportional to the number of times the different routines will be called. A Tower
Figure 12. Galois lattice of concepts in the parallel version.