Geeklog Forums
RSS feeds, ColdFusion, and Geeklog
manimal
Forum User
Newbie
Registered: 03/18/05
Posts: 1
I am a webmaster for an aerospace company and enjoy mixing and matching all manner of portal tools and technologies.
I recently brought Geeklog to my workplace to solve problems for our department websites, and found that when I tried to tie RSS/RDF news feeds into my local portal, a connection error occurred. Try as I might, nothing worked until I hit on a simple solution.
This seemed odd to me, since I could reach the news feed URL from any browser, so I researched the often-helpful discussions here on Geeklog.net and found that many others had experienced the same problem.
I saw the references to setting "allow_url_fopen = On" in .htaccess or in the php.ini control file, which my site was already configured to match.
Still... no luck in getting the feeds to work correctly, until I hit on the idea of using my trusty ColdFusion scheduler and a simple script to solve the problem.
Essentially, I created a cache of feeds using the ColdFusion scheduler and a direct call to each news feed URL; the content is captured and stored at a local URL on my own server.
Be aware that this is only one potential way to solve the problem. I used ColdFusion since it was already on my development server, but anyone could easily replicate the technique in another language and use a scheduler such as cron to automate the process.
In ColdFusion, the administrator can submit jobs to the scheduler on a recurring basis. My local site plugs into NASA- and CNN-based news feeds that my audience finds interesting, so I have one scheduler task for each call. It would be very easy to alter the code to loop over a list of feeds (or call a web service) to reduce the number of items in the scheduler; a sketch of that looped variant appears after the script below.
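For anyone who would rather register the task in code than click through the CF Administrator, CFML's built-in cfschedule tag can set up the same recurring job. This is only a minimal sketch, assuming the caching script lives at the somecfserver.com URL below and an hourly refresh is acceptable; the task name, dates, and times are placeholders:
------------------------------------------------
<!--- Register a recurring task that calls the caching script once an hour --->
<!--- for the CNN technology feed; one task like this per feed --->
<cfschedule action="update"
    task="CacheCNNTechFeed"
    operation="HTTPRequest"
    url="http://somecfserver.com/feeds/xmlfeed.cfm?urlAddress=http://rss.cnn.com/rss/cnn_tech.rss"
    startDate="01/01/2005"
    startTime="1:00 AM"
    interval="3600">
------------------------------------------------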
Our RSS caching script uses the following URL structure:
http://somecfserver.com/feeds/xmlfeed.cfm?urlAddress=http://rss.cnn.com/rss/cnn_tech.rss
The important element is the CGI argument "urlAddress", which tells the script which feed to fetch and record to the local story cache.
The script is represented below:
------------------------------------------------
<!--- RSS news feed utility for automatic feed collection --->
<!--- Set the URL address - Default to the Macromedia news feed --->
<cfparam name="urlAddress" default="http://www.macromedia.com/desdev/resources/macromedia_resources.xml">
<!--- Grab the last element of the slash-delimited URL to use as the local file name --->
<cfset filename = ListLast(urlAddress, "/")>
<!--- Go to the remote URL and capture its content --->
<cfhttp url="#urlAddress#" method="GET" proxyserver="yourproxyaddress.com" proxyport="8080"></cfhttp>
<!--- Write the XML file to cache for use with internal Geeklogs --->
<cffile action="write" file="c:\inetpub\wwwroot\feeds\#filename#" output="#CFHTTP.FileContent#">
------------------------------------------------
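And here is a rough sketch of the looped variant mentioned above, which caches several feeds in one pass so the scheduler only needs a single task. The feed list and proxy settings are placeholders; substitute whatever feeds your own audience reads:
------------------------------------------------
<!--- Sketch: loop over a comma-delimited list of feed URLs and cache each one --->
<cfset feedList = "http://rss.cnn.com/rss/cnn_tech.rss,http://www.nasa.gov/rss/dyn/breaking_news.rss">
<cfloop list="#feedList#" index="urlAddress" delimiters=",">
    <!--- Same steps as the single-feed script above --->
    <cfset filename = ListLast(urlAddress, "/")>
    <cfhttp url="#urlAddress#" method="GET" proxyserver="yourproxyaddress.com" proxyport="8080"></cfhttp>
    <cffile action="write" file="c:\inetpub\wwwroot\feeds\#filename#" output="#CFHTTP.FileContent#">
</cfloop>
------------------------------------------------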
Remember, this is a "down and dirty" solution... It is not meant to be elegant. In my case, I did not feel like launching a big investigation to get our network operations group to change some rules, or digging into a problem I could handle on my own... As far as the code design goes, I decided to keep the initial process simple until I had more experience with it and could write a more elegant and portable solution.
I hope this helps some of you. If you wish to contact me, drop me an email and I'll be glad to get in touch.
Jon F. Almada