Semantic Web


The Semantic Web
Week 20: Agents that can plan and learn
Module Website: http://scom.hud.ac.uk/scomtlm/chs2533
Practical this week:

Today:
1. Relationship between Generative Planning and OWL-S
2. More on reasoning with actions

Rest of Course:
Weeks 21 - 23: SW Applications and current activity: European Projects + Information Management

Recall OWL-S – upper level ontology

OWL-S is a language for describing web services. It is built from DAML-S and is written in OWL. There are 3 parts to a web service specification: the Service presents a ServiceProfile, is described by a ServiceModel, and supports a ServiceGrounding.

The Service Model contains a model of the PROCESS of the Service, and a concrete definition of its Inputs, Outputs, Preconditions and Effects. The planning operator below, for example, gives such a definition for a getMoney service:

op( getMoney(Bank,20),
    [ ],
    [ ssc(Bank, [have_password(Bank,P), balance(Bank,A), ge(A,20), is(AZ,A-20)],
                [have_password(Bank,P), balance(Bank,AZ)]),
      ssc(M,    [haveresource(M,Y), is(YZ,Y+20)],
                [haveresource(M,YZ)]) ],
    [ ]).

Input: Bank = ebank
Preconditions: have_password(ebank,P), balance(ebank,A), haveresource(M,Y), ge(A,20), is(AZ,A-20), is(YZ,Y+20)
Effects/Outputs: have_password(ebank,P), balance(ebank,AZ), haveresource(M,YZ), ¬balance(ebank,A), ¬haveresource(M,Y)

Planning program example

task2 :- startOCL(
    % GOALS
    [ se(ticket, [have_ticket(ticket,apollo,cats)]),
      se(dvd2,   [have_dvd(dvd2,amazon,terminator)]),
      se(dvd1,   [have_dvd(dvd1,amazon,star_wars)]) ],
    [ ss(ebank,   [have_password(ebank,abcd), balance(ebank,100)]),
      ss(money,   [haveresource(money,10)]),
      ss(service, [logged_off]),
      ss(apollo,  [service(apollo), freeseat(apollo,cats), seat_price(apollo,25)]),
      % "STATIC" KNOWLEDGE
      ss(amazon,  [service(amazon), dvd_price(amazon,terminator,10),
                   instock(amazon,terminator), instock(amazon,star_wars),
                   dvd_price(amazon,star_wars,15)]),
      ss(dvd1,    [have_no_dvd(amazon,star_wars)]),
      ss(dvd2,    [have_no_dvd(amazon,terminator)]),
      ss(ticket,  [have_no_ticket(apollo,cats)]) ] ).

Operator Schema

op( buy_dvd(DVD,Seller,Prod,Price),
    [ se(Seller, [instock(Seller,Prod), dvd_price(Seller,Prod,Price)]),
      se(S, [logged_on(S,Seller)]) ],
    [ ssc(M, [haveresource(M,Y), ge(Y,Price), is(YX,Y-Price)], [haveresource(M,YX)]),
      ssc(DVD, [have_no_dvd(Seller,Prod)], [have_dvd(DVD,Seller,Prod)]) ],
    [ ]).

op( book_theatre_seat(Ticket,Seller,Prod),
    [ se(Seller, [freeseat(Seller,Prod), seat_price(Seller,X)]),
      se(S, [logged_on(S,Seller)]) ],
    [ ssc(Money, [haveresource(Money,Y), ge(Y,X), is(YX,Y-X)], [haveresource(Money,YX)]),
      ssc(Ticket, [have_no_ticket(Seller,Prod)], [have_ticket(Ticket,Seller,Prod)]) ],
    [ ]).

op( getMoney(Bank,20),
    [ ],
    [ ssc(Bank, [have_password(Bank,P), balance(Bank,A), ge(A,20), is(AZ,A-20)],
                [have_password(Bank,P), balance(Bank,AZ)]),
      ssc(M, [haveresource(M,Y), is(YZ,Y+20)], [haveresource(M,YZ)]) ],
    [ ]).

op( logon(Seller),
    [ se(money, [haveresource(money,R), gt(R,0)]),
      se(Seller, [service(Seller)]) ],
    [ ssc(service, [logged_off], [logged_on(service,Seller)]) ],
    [ ]).

op( logoff(Seller),
    [ ],
    [ ssc(service, [logged_on(service,Seller)], [logged_off]) ],
    [ ]).

Dynamic Knowledge

Execution Example

| ?- task2.
..................
goal [se(ticket,[have_ticket(ticket,apollo,cats)]),
      se(dvd2,[have_dvd(dvd2,amazon,terminator)]),
      se(dvd1,[have_dvd(dvd1,amazon,star_wars)])]
achieved by sequence
  [getMoney(ebank,20), getMoney(ebank,20),
   logon(apollo), book_theatre_seat(ticket,apollo,cats), logoff(apollo),
   logon(amazon), buy_dvd(dvd2,amazon,terminator,10), buy_dvd(dvd1,amazon,star_wars,15)]
699 nodes generated
4.51 seconds of cpu
154.9889135254989 nodes search per second
new state is
  [ss(dvd1,[have_dvd(dvd1,amazon,star_wars)]),
   ss(money,[haveresource(money,0)]),
   ss(dvd2,[have_dvd(dvd2,amazon,terminator)]),
   ss(service,[logged_on(service,amazon)]),
   ss(ticket,[have_ticket(ticket,apollo,cats)]),
   ss(ebank,[have_password(ebank,abcd),balance(ebank,60)]),
   ss(apollo,[service(apollo),freeseat(apollo,cats),seat_price(apollo,25)]),
   ss(amazon,[service(amazon),dvd_price(amazon,terminator,10),instock(amazon,terminator),
              instock(amazon,star_wars),dvd_price(amazon,star_wars,15)])]

Planning Algorithm

Nodes = (State, Sequence of actions that got to State)

1. Store the first node (initial state + empty solution)
Repeat:
2. pick a node (State, Sequence)
3. pick an action and parameter grounding 'A' that can be applied to State
4. apply A to State to get State'
5. store the new node (State', Sequence + A)
6. if possible, backtrack to 3. and make a different choice
Until a node (State, Sequence) has been asserted such that State contains the goal literals
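As a rough illustration, this loop can be written as a depth-first search with backtracking in Prolog. The sketch below makes simplifying assumptions: the state and the goals are flat lists of ground literals, and applicable/3 is a hypothetical helper that enumerates each grounded action together with the successor state it produces (the real planner derives this from the op/4 schemas and their ssc clauses).

% applicable(?Action, +State, -NewState) is assumed to be provided
plan(InitState, Goals, Plan) :-
    length(Plan, _),                  % iterative deepening on plan length, so the search terminates
    search(InitState, Goals, Plan).

% stop when every goal literal is contained in the current state
search(State, Goals, []) :-
    forall(member(G, Goals), member(G, State)).
% otherwise pick an applicable grounded action A, apply it, and recurse;
% Prolog's own backtracking supplies step 6 ("make a different choice")
search(State, Goals, [A | Rest]) :-
    applicable(A, State, NewState),
    search(NewState, Goals, Rest).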

Applying Planning Actions

4. apply A to State to get State' ….

A = getMoney(ebank,20)

op( getMoney(Bank,20),
    [ ],
    [ ssc(Bank, [have_password(Bank,P), balance(Bank,A), ge(A,20), is(AZ,A-20)],
                [have_password(Bank,P), balance(Bank,AZ)]),
      ssc(M, [haveresource(M,Y), is(YZ,Y+20)], [haveresource(M,YZ)]) ],
    [ ]).

Parameter "Bank" => constant "ebank", so getMoney(ebank,20) has:
Precondition = have_password(ebank,P), balance(ebank,A), ge(A,20), is(AZ,A-20), haveresource(M,Y), is(YZ,Y+20)
Postcondition = have_password(ebank,P), balance(ebank,AZ), haveresource(M,YZ)

Planning Algorithm: applying actions

4. apply A to State to get State' …. EXAMPLE

State = [ ss(ebank, [have_password(ebank,abcd), balance(ebank,100)]),
          ss(money, [haveresource(money,10)]),
          ss(service, [logged_off]),
          … etc. ]

getMoney(ebank,20):
Precondition = have_password(ebank,P), balance(ebank,A), ge(A,20), is(AZ,A-20), haveresource(M,Y), is(YZ,Y+20)
Postcondition = have_password(ebank,P), balance(ebank,AZ), haveresource(M,YZ)

Apply A:
1. Are the preconditions achieved? Yes, with P = abcd, A = 100, AZ = 80, M = money, Y = 10, YZ = 30.
2. Apply the postconditions:

State' = [ ss(ebank, [have_password(ebank,abcd), balance(ebank,80)]),
           ss(money, [haveresource(money,30)]),
           ss(service, [logged_off]),
           … etc. ]
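The precondition matching and state update in this step come down to Prolog unification plus arithmetic. The sketch below shows one way to apply a single ssc(Obj, Pre, Post) change, assuming an object's description ss(Obj, Literals) is simply replaced by the grounded postcondition list; apply_ssc/3 and satisfy/2 are names introduced here for illustration, not part of the course code.

:- use_module(library(lists)).   % select/3, member/2

% apply one object-state change ssc(Obj, Pre, Post) to the overall state
apply_ssc(ssc(Obj, Pre, Post), State, NewState) :-
    select(ss(Obj, Current), State, Rest),     % pick out the object's state description
    satisfy(Pre, Current),                     % unification binds P, A, AZ, M, Y, YZ, ...
    NewState = [ss(Obj, Post) | Rest].         % replace it with the grounded postconditions

satisfy([], _).
satisfy([ge(X, Y) | Ps], Cur) :- !, X >= Y, satisfy(Ps, Cur).          % numeric test
satisfy([is(X, E) | Ps], Cur) :- !, X is E, satisfy(Ps, Cur).          % arithmetic, e.g. AZ is 100 - 20
satisfy([Lit | Ps], Cur)      :- member(Lit, Cur), satisfy(Ps, Cur).   % ordinary state literal

Applying the Bank substate change of getMoney(ebank,20) in this way would replace ss(ebank, [have_password(ebank,abcd), balance(ebank,100)]) with ss(ebank, [have_password(ebank,abcd), balance(ebank,80)]), as in State' above.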

Example Agent Architecture

[Diagram: the USER sends high level requests to the agent and receives responses. Inside the agent, a generative PLANNER works with a static knowledge base and a dynamic knowledge base; plan execution acts on the WEB ENVIRONMENT, and failure feedback goes to a learning component.]
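A hedged sketch, in the same Prolog style, of the control flow the diagram suggests: all the predicates apart from plan/3 (sketched earlier) are hypothetical placeholders introduced here, not part of the course code.

% take a user request (goal literals), plan, execute, and learn from failures
agent(Goals) :-
    current_state(State),                 % read the dynamic knowledge base
    plan(State, Goals, Plan),             % generative planner
    (   execute(Plan)                     % plan execution against the web environment
    ->  respond(success(Plan))            % response back to the user
    ;   learn_from_failure(Goals, Plan),  % failure feedback to the learning component
        respond(failure(Goals))
    ).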

Summary

- Agents have to have a degree of autonomy to make them useful.
- They need to be able to reason with actions, time, events and resources.
- Web Agents will be able to effect changes by using Web Services (collect information, supply information, buy goods, organise transactions, ...).
- Planning algorithms are useful to allow agents to make plans that achieve goals.
