Project 1-3: Enhancing the RSS Feed Reader

This tutorial continues creating an RSS feed reader in Director. It is intended to demonstrate an 'object-oriented' approach to creating applications in Director.

Summary of Parts 1 and 2

In part 1, three basic scripts were created:

  • SubscriptionMgr for managing a list of subscriptions (URLs) and creating a FeedObject for each subscription
  • RSS-Feed for creating 'Feed Objects', which have methods for getting information about a feed, a list of article titles, and descriptions for specific articles
  • RSS-Parser for parsing the RSS XML into Director-friendly propLists

In part 2, a basic framework for a graphical user interface for these scripts was created. In this third part, the framework will be fleshed out as three new features are added to the reader:

  1. Local storage of feed data (so the reader can display information from the last time the feed was updated while it waits for the new data to be downloaded);
  2. Show 'unread' items in bold (strictly speaking, we can't tell whether the user has actually read an item - but we can record whether or not the item has been displayed);
  3. An automated system for refreshing all subscriptions.

Storing Data

The SubscriptionMgr described in part 1 uses the 'listSaver' script from the Lingoworkshop Xlib to save a list of feeds. There are some good Xtras for storing lists in local files in a binary format (such as PropSave, BuddyFile and vList). Using such an Xtra would be quicker than parsing through a text file (which is what the ListSaver does), but since the RSS Feeder will work in Shockwave and I don't yet own a copy of vList (the only Xtra of the three mentioned that will work in Shockwave), I'll continue to use 'listSaver' for storing data. If we decide to use an Xtra later, it is simply a matter of changing the listSaver script (this illustrates the OOP idea of 'black boxes' - provided the object I am calling 'ListSaver' reads and writes lists in response to appropriate messages, it doesn't really matter how the script is actually implemented).
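As a sketch of that 'black box' idea: any replacement for 'ListSaver' only needs to honour the same two messages used throughout this tutorial. The signatures below are inferred from the calls made later in this article (ReadList takes a file name and returns a list or VOID; SaveList takes a list and a file name) - how the data is actually stored is the script's own business:

-- script("ListSaver") - the public interface a drop-in replacement must honour

on ReadList (me, aFileName)
  -- return the list previously stored under aFileName,
  -- or VOID if no such file exists
end

on SaveList (me, aList, aFileName)
  -- persist aList under aFileName
end

If we later switch to an Xtra such as vList, only the bodies of these two handlers change; every script that sends ReadList and SaveList messages carries on unmodified.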

Storing Feed Data

In part 1, a basic 'FeedObject' script was outlined. This script stored the data that was parsed from a feed's XML. Its new() method looked like this:

-- script("RSS-Feed")

property URI
property ChannelInfo
property ItemsList
property StatusStr

on new (me, aURL)
  URI = aURL
  
  -- initialise some properties (these will get
  -- populated when the XML has been parsed)
  ChannelInfo = [:]
  ItemsList = []
  
  -- now download the XML
  StatusStr = "Getting XML"
  aURL = aURL & "?" & random(the milliseconds)
  NetOp = script("NetOp.transaction").new(aURL)
  NetOp.AddListener(me)
  NetOp.Start()
  return me
end

In this next version, we are going to change the script so that rather than starting out with an empty list, it tries to start with the list that was created the last time the feed was updated. So, the new version of this script will look like this (the new code is in blue):

-- script("RSS-Feed")
--@ Version 0.2

property URI
property ChannelInfo
property ItemsList
property StatusStr
property Cache

on new (me, aURL)
  URI = aURL
  Cache = script("RSS-Feed.cache").new()
  
  -- initialise some properties (these will get
  -- populated when the XML has been parsed)
  ChannelInfo = [:]
  ItemsList = []
  
  -- firstly populate these two lists
  -- with data from the local file (if any)
  CacheData = Cache.GetCachedData(URI)
  if CacheData.ilk = #PropList then
    ChannelInfo = CacheData[#ChannelInfo]
    ItemsList = CacheData[#ItemsList]
  end if
  
  -- now download the XML
  StatusStr = "Getting XML"
  aURL = aURL & "?" & random(the milliseconds)
  NetOp = script("NetOp.transaction").new(aURL)
  NetOp.AddListener(me)
  NetOp.Start()
  return me
end

The idea here is that if the (yet to be written) 'RSS-Feed.cache' script can return some data for the specified URL, we use that data until the data is refreshed from the remote XML. When we do get updated data from the remote XML source, we save it to the local file. Here's the modified 'NetTransactionComplete' method of the feedObj (the new line of code in blue):

-- script("RSS-Feed") [continued]

on NetTransactionComplete (me, sender, netData)
  if netData.error = 0 then
    -- got the net text, now parse it
    XMLStr = netData.text
    RSSObj = script("RSS-Parser").new()
    parseData = RSSObj.Load(XMLStr)
    
    if parseData.error = 0 then
      -- successfully parsed the XML
      if parseData[#ChannelInfo].ilk = #PropList then ChannelInfo = parseData[#ChannelInfo]
      if parseData[#itemsList].ilk = #List then ItemsList = parseData[#itemsList]
      StatusStr = "Done"
      script("RSS-Feed.cache").Update(URI, parseData)
    else
      -- error parsing
      StatusStr = "Error parsing the XML feed"
    end if
  else
    -- error getting the XML
    StatusStr = "Error retrieving the XML feed - " & sender.GetErrorDescription(netData.error)
  end if
end

The RSS-Feed.cache Script

So far, the 'RSS-Feed.cache' script has two public methods: GetCachedData for getting data, and Update for updating the data. You might be thinking "hang on - those two methods are pretty much the same as the ReadList and SaveList methods of the ListSaver script... so what's the point of having this 'RSS-Feed.cache' script? Why can't the RSS-Feed script just use the ListSaver script to save the lists itself?". The answer is that the code required to associate a feed with a file is a little complex and requires a central 'index' or catalog (for reasons I will discuss). Moreover, I like to keep the role of an object as simple and obvious as possible, and adding all the extra code for locating, reading and writing local files to the FeedObj would start to make the FeedObj's role a little less obvious. Having a separate 'cache' script gives a much clearer, simpler division of roles and responsibilities. If there are problems and feeds are showing up empty, then the first place to look is the cache script, to see whether data is being written and read correctly.

The reason why the RSS-Feed.cache script gets a little complex is that it must somehow know which feed is associated with which cache file. This wouldn't be a problem if we were using one of the Xtras and storing all the cached data in a single file (in that case, we would just store all the data in one big propList which matches each URL to a list of data for that URL). However, we are using the ListSaver script, which uses Lingo's getPref and setPref functions. Since there is probably a limit to how much data we can save in a single file using getPref and setPref, we are going to store the data for each feed in its own file.

If the list of URLs we have subscribed to never changed, then we could use a simple numerical system for indexing the cache files. For example, if http://www.lingoworkshop.com/rss.xml was the first feed, we could store the downloaded data in a file called 'cache_1.txt', and the next URL in 'cache_2.txt'. However, we are going to add the ability for users to add and remove feeds from their subscription list, so we cannot be certain that the item at a specific position in the list is the same as it was last time. So what we want to do is associate the feed URL (which is unique to each feed) with the file name. Ideally, we would use the URL as the file name - or perhaps some compressed version of the URL - but setPref is very fussy about file names (they must end in .txt or .html, and the documentation recommends a maximum of 8 characters). So we are going to create a 'catalog' file which matches URLs to IDs and use these IDs to name the files. In pseudo code, the script will look like this:

-- RSS-Feed.cache

on GetCachedData (me, feed_url)
  -- determine the cache file for this feed
  -- read the cache file
  -- parse the data and return the result
end

on Update (me, feed_url, data)
  -- determine the cache file for this feed
  -- write the data to the cache file
end



To determine which file to use, the script will use a 'catalog' - a propList pairing URLs with files. For this to work, there needs to be a single catalog that all instances of the cache script can access, so we'll use a 'script object property' (which is a little like a 'class property').

-- RSS-Feed.cache

-- INSTANCE METHODS

on GetCachedData (me, feed_url)
  Catalog = me._InitialiseCatalog()
  CacheFile = Catalog.getAProp(feed_url)
  -- parse the data and return the result
end

on Update (me, feed_url, data)
  Catalog = me._InitialiseCatalog()
  CacheFile = Catalog.getAProp(feed_url)
  -- write the data to the cache file
end

-- CLASS METHODS
-- These are pseudo 'class methods'. They enable all instances of this script
-- to access a shared property (in this case, a property list)

property Catalog

on _InitialiseCatalog (this)
  -- make sure that 'this' is the script object, not
  -- an instance of it
  if this.ilk = #instance then return this.script._InitialiseCatalog()
  else if this.ilk <> #script then return #param_error
  
  -- if the catalog has not been created yet, create it now
  if voidP(this[#Catalog]) then
    ListSaver = script("ListSaver").new()
    cat = ListSaver.ReadList("SWFR_CAT.txt")
    if voidP(cat) then
      -- new catalog required
      DefaultCatalog = [:]
      DefaultCatalog[#Version] = "1.0.0"
      DefaultCatalog[#NextIndex] = 1
      DefaultCatalog[#LastUpdate] = the long date & " at " & the long time
      ListSaver.SaveList(DefaultCatalog, "SWFR_CAT.txt")
      cat = DefaultCatalog
    end if
    this[#Catalog] = cat
  end if
  return this[#Catalog]
end

on _UpdateCatalog (this)
  -- make sure that 'this' is the script object, not
  -- an instance of it
  if this.ilk = #instance then return this.script._UpdateCatalog()
  else if this.ilk <> #script then return #error
  
  -- update the 'last update' property and save to file
  this.Catalog[#LastUpdate] = the long date & " at " & the long time
  ListSaver = script("ListSaver").new()
  ListSaver.SaveList(this.Catalog, "SWFR_CAT.txt")
end



When cache data is requested from an instance of this script, the first thing the script does is get a reference to the 'catalog' which matches URLs to files. Each instance delegates this to the script object via this line:

if this.ilk = #instance then return this.script._InitialiseCatalog()

If a catalog hasn't yet been created, it creates a new one. You might notice that in addition to storing URLs and file names, the catalog has a few additional properties: #Version, #NextIndex and #LastUpdate. The #Version property might come in handy later if we change the format of the list or the data we save in the cache file. If that happens, we might write some Lingo that says 'if Catalog.version = "1.0.0" then do something differently'. The #NextIndex property simply keeps track of the index to use for the next cache file. Whenever we create a new cache file, we increment this value.
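For illustration, after two feeds have been registered the catalog propList might look something like this (the second URL, file names and date here are purely hypothetical examples):

-- a hypothetical catalog after two feeds have been registered
[#Version: "1.0.0", #NextIndex: 3, #LastUpdate: "Saturday, March 25, 2006 at 10:15:02 AM", "http://www.lingoworkshop.com/rss.xml": "SWFR1.txt", "http://example.com/rss.xml": "SWFR2.txt"]

A call such as Catalog.getAProp("http://www.lingoworkshop.com/rss.xml") would then return "SWFR1.txt", and the next feed to be registered would get the file "SWFR3.txt".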

One possible weakness with this system is that it is dependent on the catalog maintaining its integrity (if the catalog is deleted or corrupted for some reason, and gets re-built, the filenames associated with URLs might change). To make sure the data stored in a cache file actually belongs to the URL in question, we'll also store the URL in the cache data. Here is the final version of the cache script with the completed GetCachedData and Update methods:



-- "RSS-Feed.cache" Parent Script
-- @version 1.0

property URI
property ChannelInfoRef
property ItemsListRef
property CacheFile

on GetCachedData (me, feed_url)
  -- returns the 'Feed propList' which is either empty
  -- or populated by data read from the cache
  
  -- first, get the catalog list from the static script property
  Catalog = me._InitialiseCatalog()
  CacheFile = Catalog.getAProp(feed_url)
  
  rList = [:]
  rList[#ChannelInfo] = [:]
  rList[#itemsList] = []
  
  if voidP(CacheFile) then
    -- cache not created yet, so create one
    NextIdx = Catalog.NextIndex
    CacheFile = "SWFR" & NextIdx & ".txt"
    Catalog.NextIndex = Catalog.NextIndex + 1
    Catalog.addProp(feed_url, CacheFile)
    me._UpdateCatalog()
  end if
  
  ListSaver = script("ListSaver").new()
  parseData = ListSaver.ReadList(CacheFile)
  if parseData.ilk = #propList then
    if parseData[#URL] = feed_url then
      if parseData[#ChannelInfo].ilk = #PropList then rList[#ChannelInfo] = parseData[#ChannelInfo]
      if parseData[#itemsList].ilk = #List then rList[#itemsList] = parseData[#itemsList]
    end if
  else
    put "no cache found"
  end if
  
  return rList
end

on Update (me, feed_url, data)
  -- updates the feed data stored in the cache
  
  Catalog = me._InitialiseCatalog()
  CacheFile = Catalog.getAProp(feed_url)
  
  if voidP(CacheFile) then
    -- a cache file has not been created yet for this url
    NextIdx = Catalog.NextIndex
    CacheFile = "SWFR" & NextIdx & ".txt"
    Catalog.NextIndex = Catalog.NextIndex + 1
    Catalog.addProp(feed_url, CacheFile)
    me._UpdateCatalog()
  end if
  
  -- make sure the URL is stored with the data
  data[#URL] = feed_url
  ListSaver = script("ListSaver").new()
  ListSaver.SaveList(data, CacheFile)
end

-- CLASS METHODS
-- These are pseudo 'class methods'. They enable all instances of this script
-- to access a shared property (in this case, a property list)

property Catalog

on _InitialiseCatalog (this)
  -- make sure that 'this' is the script object, not
  -- an instance of it
  if this.ilk = #instance then return this.script._InitialiseCatalog()
  else if this.ilk <> #script then return #param_error
  
  if voidP(this[#Catalog]) then
    ListSaver = script("ListSaver").new()
    cat = ListSaver.ReadList("SWFR_CAT.txt")
    if voidP(cat) then
      -- new catalog required
      DefaultCatalog = [:]
      DefaultCatalog[#Version] = "1.0.0"
      DefaultCatalog[#NextIndex] = 1
      DefaultCatalog[#LastUpdate] = the long date && "at" && the long time
      ListSaver.SaveList(DefaultCatalog, "SWFR_CAT.txt")
      cat = DefaultCatalog
    end if
    this[#Catalog] = cat
  end if
  return this[#Catalog]
end

on _UpdateCatalog (this)
  -- make sure that 'this' is the script object, not
  -- an instance of it
  if this.ilk = #instance then return this.script._UpdateCatalog()
  else if this.ilk <> #script then return #error
  
  this.Catalog[#LastUpdate] = the long date && "at" && the long time
  ListSaver = script("ListSaver").new()
  ListSaver.SaveList(this.Catalog, "SWFR_CAT.txt")
end
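To make the integrity check concrete: with the code above, each cache file ends up holding a propList along the lines of the following (the values here are hypothetical). When GetCachedData reads the file back, it only uses the data if the stored #URL matches the requested feed_url - so even if a rebuilt catalog points a URL at the wrong file, stale data from another feed will never be displayed:

-- hypothetical contents of a cache file such as "SWFR1.txt"
[#ChannelInfo: [#title: "Lingoworkshop", #link: "http://www.lingoworkshop.com"], #itemsList: [[#title: "Project 1-3", #description: "Enhancing the RSS Feed Reader"]], #URL: "http://www.lingoworkshop.com/rss.xml"]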

Tracking which items have been read

To track which items have been read, the FeedObj is going to add a new property to the itemsList. This property, called 'read', will default to false, but will be set to true when the FeedObj receives a message to display the item. Since this property is being added to the itemsList, it will be stored in the cache along with all the other data for the feed. Here is the updated 'RSS-Feed' script with the new code highlighted in blue:



-- "RSS-Feed" Parent Script
-- @version 1.1

property URI
property ChannelInfo
property ItemsList
property StatusStr
property Cache

on new (me, aURL)
  URI = aURL
  Cache = script("RSS-Feed.cache").new()
  
  -- initialise some properties (these will get
  -- populated first by the cache, and then updated
  -- when the XML has been parsed)
  ChannelInfo = [:]
  ItemsList = []
  
  -- populate these two lists with data from the cache
  CacheData = Cache.GetCachedData(URI)
  ChannelInfo = CacheData[#ChannelInfo]
  ItemsList = CacheData[#ItemsList]
  
  -- now download the XML. When the XML has been downloaded,
  -- it will be parsed and used to update the data lists
  StatusStr = "Getting XML"
  aURL = aURL & "?" & random(the milliseconds)
  NetOp = script("NetOp.transaction").new(aURL)
  NetOp.AddListener(me)
  NetOp.Start()
  return me
end

on NetTransactionComplete (me, sender, netData)
  if netData.error = 0 then
    -- got the net text, now parse it
    XMLStr = netData.text
    RSSObj = script("RSS-Parser").new()
    parseData = RSSObj.Load(XMLStr)
    
    if parseData.error = 0 then
      -- successfully parsed the XML
      if parseData[#ChannelInfo].ilk = #PropList then ChannelInfo = parseData[#ChannelInfo]
      if parseData[#itemsList].ilk = #List then
        -- First, create a copy
        -- of the list we are about to replace
        previousItemsList = ItemsList.duplicate()
        -- Now update the itemsList.
        ItemsList = parseData[#itemsList]
        -- now add the 'read' property
        repeat with anItem in ItemsList
          anItem.setAProp(#read, false)
        end repeat
        -- now update the value for 'read' from the previous list
        repeat with anItem in previousItemsList
          if anItem[#read] then
            -- this item has been read. Is it still in the list?
            foundItem = me._FindItem(anItem)
            if voidP(foundItem) then
              -- no longer in the itemsList
            else
              foundItem.setAProp(#read, true)
            end if
          end if
        end repeat
      end if
      
      StatusStr = "Done"
      Cache.Update(URI, parseData)
    else
      -- error parsing
      StatusStr = "Error parsing the XML feed"
    end if
  else
    -- error getting the XML
    StatusStr = "Error retrieving the XML feed - " & sender.GetErrorDescription(netData.error)
  end if
end

on NetTransactionStatusUpdate (me, sender, data)
  StatusStr = data[#state]
  if StatusStr = "InProgress" then
    StatusStr = "Downloading" && integer(data[#fractiondone]*100) & "%"
  end if
end

-- Some Accessors for the GUI

on GetStatus (me)
  return StatusStr
end

on GetFeedURL (me)
  return URI
end

on GetChannelInfo (me)
  return channelInfo
end

on GetTitleList (me)
  rList = []
  repeat with anItem in itemsList
    rList.append([anItem[#title], anItem[#read]])
  end repeat
  return rList
end

on GetItemAt (me, p)
  if p < 1 or p > itemsList.count then return #Index_Error
  itemsList[p][#read] = true
  Cache.Update(URI, [#ChannelInfo: ChannelInfo, #itemsList: itemsList, #URL: URI])
  return itemsList[p]
end

on GetUnreadItemsCount (me)
  cnt = 0
  repeat with thisItem in ItemsList
    if NOT(thisItem[#read]) then cnt = cnt + 1
  end repeat
  return cnt
end

-- Private

on _FindItem (me, anItem)
  -- find 'an item' in the current itemsList by comparing it
  -- to another item
  repeat with thisItem in ItemsList
    if anItem[#title] = thisItem[#title] then
      if anItem[#description] = thisItem[#description] then
        return thisItem
      end if
    end if
  end repeat
end

You may notice a new public method for the FeedObj: GetUnreadItemsCount. This method will be used when we display the subscription list to show which feeds have unread items. GetTitleList has also changed so that it now returns a list of titles along with an indication of whether or not each title has been read. We need to change the behaviours that display these lists to reflect the new information. Since we are still creating a bare-bones prototype (we'll work on a fancy GUI soon), we'll just modify the simple behaviours we have been using (download the source movie to see these changes - since these behaviours are just placeholders until we start working on the GUI in more detail, they won't be discussed here).

The Auto-updater

The next script we will create is the subscription auto-updater. This script will create an object whose job it is to update all the feeds quietly in the background. If you remember from the first part of this tutorial series, the SubscriptionMgr script keeps a list of URLs (the SubscriptionList) and, when a feed is selected, a new FeedObject is created for that feed and the feed is added to the OpenFeeds list. If the feed already exists when it is selected (that is, it can be found in the OpenFeeds list), then the previously opened feed is used. What the auto-updater needs to do is grab a copy of the SubscriptionList and then open the first unopened feed in that list. When a feed has been downloaded and parsed, the next unopened feed is opened. Ideally, both the auto-updater and the SubscriptionMgr should be using the same OpenFeeds list. This way, if the user has clicked a feed (and therefore started a download for it), when the auto-updater reaches that feed it doesn't bother re-downloading it. This sharing of lists is easily achieved by passing a reference to the list, not a duplicate of it (remember, lists - like other Lingo objects - are passed 'by reference', not 'by value').
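A quick Message Window illustration of that last point (the URLs here are just examples) - a list assigned to a second variable is shared, not copied, unless you explicitly duplicate() it:

a = ["http://www.lingoworkshop.com/rss.xml"]
b = a                      -- b refers to the same list as a
b.append("http://example.com/rss.xml")
put a.count                -- 2: the change made via b is visible through a
c = a.duplicate()          -- c is an independent copy
c.deleteAt(1)
put a.count                -- still 2: a is unaffected

This is exactly why handing the updater the OpenFeeds list (rather than a duplicate) means both objects always see the same set of open feeds.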

Since the SubscriptionMgr.Updater script will need to get references to the SubscriptionList and OpenFeeds lists from the SubscriptionMgr object, we will pass a reference to the SubscriptionMgr as a startup parameter. Here is the basic updater script:

-- "SubscriptionMgr.Updater" Parent Script
-- @version 1.0

property SubscriptionsToRefresh
property OpenFeeds
property ActiveFeed
property Status

on new (me, subscriptionMgr)
  SubscriptionList = subscriptionMgr.GetSavedSubscriptions()
  SubscriptionsToRefresh = SubscriptionList.duplicate()
  OpenFeeds = subscriptionMgr.GetOpenFeedsList()
  me._OpenNextSubscription()
  return me
end

on _OpenNextSubscription (me)
  -- check whether there is anything left to check
  if SubscriptionsToRefresh.count = 0 then
    return false
  else
    -- get the site from the 'to refresh' list,
    -- and delete it from the list
    site = SubscriptionsToRefresh[1]
    SubscriptionsToRefresh.deleteAt(1)
    -- have we already created a feedObj for this URL?
    feed = OpenFeeds.getAProp(site.url)
    if feed.ilk <> #instance then
      -- haven't created it yet
      ActiveFeed = script("RSS-Feed").new(site.url)
      put ActiveFeed.URI && " -> updating..."
      OpenFeeds.AddProp(site.url, ActiveFeed)
      return true
    else
      -- feed has already been opened (possibly
      -- user clicked the link to open it)
      put feed.URI && " already opened"
      return me._OpenNextSubscription()
    end if
  end if
end

Now when a feed has been updated, we will want to update the GUI (for example, if a feed has some new items, we will want to show this both in the feed selector listbox and in the list of items, if that feed is currently being viewed). To do this, either our GUI needs to be able to periodically check whether the feed has changed, or the auto-updater needs to broadcast this change to the GUI. To keep things a little simpler, we'll take the second approach and get the auto-updater to sendAllSprites an #UpdateSubscriptionList message when a feed has been updated. We'll do this by having the auto-updater create a timeout object that sends a Timer_CheckProgress message every 100ms back to the auto-updater. The auto-updater responds to this message by checking whether the active feed (ie the one that has just been 'opened') has been fully downloaded. If it has been downloaded, then the next feed is opened. If there are no more feeds to open, the Status of the object is changed to #Done and the timeout is destroyed.

Also note that if the timeout object contains the only reference to this auto-updater object, then destroying the timeout will also destroy the updater (which is convenient). Here's the final updater script (changes from the previous version outlined above are highlighted in blue):

-- "SubscriptionMgr.Updater" Parent Script
-- @version 1.0

property SubscriptionsToRefresh
property OpenFeeds
property ActiveFeed
property Status

on new (me, subscriptionMgr)
  SubscriptionList = subscriptionMgr.GetSavedSubscriptions()
  SubscriptionsToRefresh = SubscriptionList.duplicate()
  OpenFeeds = subscriptionMgr.GetOpenFeedsList()
  downloading = me._OpenNextSubscription()
  if downloading then
    aTimerObj = timeout().new("subscriptionMgr-Updater", 100, #Timer_CheckProgress, me)
  end if
  return me.script
end

on _OpenNextSubscription (me)
  -- check whether there is anything left to check
  if SubscriptionsToRefresh.count = 0 then
    return false
  else
    -- have we already created a feedObj for this URL?
    site = SubscriptionsToRefresh[1]
    SubscriptionsToRefresh.deleteAt(1)
    feed = OpenFeeds.getAProp(site.url)
    if feed.ilk <> #instance then
      -- haven't created it yet
      ActiveFeed = script("RSS-Feed").new(site.url)
      put ActiveFeed.URI && " -> updating..."
      OpenFeeds.AddProp(site.url, ActiveFeed)
      return true
    else
      put feed.URI && " already opened"
      return me._OpenNextSubscription()
    end if
  end if
end

on Timer_CheckProgress (me, timerObj)
  -- First step is to check whether
  -- we have finished or not
  currentState = me._CheckUpdateProgress()
  -- if we've finished, then forget the
  -- timerObj. If we are running as a daemon
  -- (ie the timeout held the only reference to
  -- this object), then this will destroy this object
  if status = #Done then
    timerObj.forget()
  end if
end

on _CheckUpdateProgress (me)
  if ActiveFeed.ilk <> #instance then
    status = #Done
  else
    if ActiveFeed.GetStatus() = "Done" then
      put ActiveFeed.URI && " -> done!"
      -- tell the listeners that the feed list has been updated
      sendAllSprites(#UpdateSubscriptionList)
      ActiveFeed = VOID
      if me._OpenNextSubscription() then
        status = #Working
      else
        -- all done!
        status = #Done
      end if
    end if
  end if
  return status
end

Note - this uses the DMX2004 syntax for creating timeouts. If you are using an older version of Director, change the syntax to timeout("subscriptionMgr").new(100, #Timer_CheckProgress, me)

You may also note that the new() method returns a reference to the script rather than a new instance of the script (effectively overriding the normal result of a new() function). This is simply to ensure that there is never more than one auto-updater running at the same time (since there is only ever one occurrence of the script object).
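A minimal sketch of this 'singleton via me.script' trick (the script name here is illustrative): because new() returns the script object itself, every call to new() hands back the very same object.

-- script("Updater.demo")
on new (me)
  -- return the script object itself, not a new instance
  return me.script
end

-- In the Message Window:
-- a = script("Updater.demo").new()
-- b = script("Updater.demo").new()
-- put a = b
-- -- 1  (both variables point to the one script object)

So no matter how many times the framework tries to 'create' an updater, there is only ever one object doing the updating.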

Instantiating the Auto-updater.

The final step is to instantiate the auto-updater. Since it requires a reference to the SubscriptionMgr object, we can either instantiate it in the SubscriptionMgr script or we can create it when we create the SubscriptionMgr object (which occurs in the GUI-Framework script). Since its role is closely linked to the SubscriptionMgr - and we will probably want to add some logic later if we create a 'refresh all' button (to make sure we don't have more than one auto-updater running at the same time) - we'll create it when the SubscriptionMgr is created. Here is the modified new handler of the SubscriptionMgr (changes highlighted in blue):



-- "SubscriptionMgr" Parent Script
-- @version 1.0

property SubscriptionList
property OpenFeeds
property ActiveFeed
property UpdateDaemon

on new (me)
  SubscriptionList = me.GetSavedSubscriptions()
  OpenFeeds = [:]
  ActiveFeed = VOID
  
  -- Create the auto-updater. Note that we are not keeping a persistent
  -- reference to the updater object here. The updater will create
  -- a timeout which will store a reference to the updater. This is
  -- sufficient to keep the updater alive (until it finishes its
  -- job, at which point it will forget the timeout and thereby
  -- destroy itself).
  autoUpdater = script("SubscriptionMgr.Updater").new(me)
  
  return me
end

Note also that the new instance of the updater object is being put into a local variable ('autoUpdater') simply because, in the past, Director had a bug which seemed to mean that objects returned from a 'new' function call would be stored in some secret inaccessible location (possibly related to 'the result') unless you explicitly put the instance somewhere. This bug was definitely present when creating timeout objects (it's not clear whether it applied to other uses of the 'new' function). In any event, it doesn't hurt to put the returned instance in a local variable, since the local variable will be cleared when the current handler finishes.

Summary of Part 3

So far, we have created a basic RSS Reader that we can interact with via a crude GUI. The GUI will now indicate what feeds contain unread items. The Reader will automatically update all the feeds you have subscribed to when it starts up. If you view a feed, the data that was last downloaded will be used until the feed is updated.

The main point of the exercise so far is not to explain how best to create an RSS feed reader application in Director. Rather, the intention has been to show how, by thinking in terms of interacting objects, a relatively complex application can be built up from a few objects with specific roles. Also - hopefully - it highlights how writing neatly encapsulated scripts helps keep them reusable. The application so far only uses a handful of scripts written specifically for this application. Most of the scripts used are taken from a generic library of scripts. For example, various objects in this application use 'NetOp' objects for downloading data. These NetOps are neatly self-contained objects that operate asynchronously, making callbacks to specified 'listener' objects. There is no need to scatter "on exitframe if netDone(gNetID) then" type scripts around the place (which can be a nightmare to manage).

Downloads

Here is a source movie containing the scripts discussed.

First published 25/03/2006