Welcome to Geeklog, Anonymous. Sunday, December 22 2024 @ 08:39 am EST
Geeklog Forums
SPAMX Empty delimiter error
Status: offline
AA6QN
Forum User
Junior
Registered: 12/30/06
Posts: 16
I have a fresh install of Geeklog 1.4.1. When I try to change data in the My Account personal section, I get the error below.
2 - strpos() [function.strpos]: Empty delimiter. @ /some-location/geeklog/plugins/spamx/SLVbase.class.php line 159
If I disable the SPAMX plugin, the personal updates work fine.
The install went well. The only difference is that I am using: $_CONF['site_url'] = ''; for site url in the /geeklog/config.php since the server is headless in a DMZ.
Other than the error above all seems to be working 5x5
Best Regards, JohnF
Status: offline
Dirk
Site Admin
Admin
Registered: 01/12/02
Posts: 13073
Location:Stuttgart, Germany
Quote by: AA6QN
The only difference is that I am using: $_CONF['site_url'] = ''; for the site URL in /geeklog/config.php, since the server is headless in a DMZ.
And that's exactly the problem: the error occurs where the plugin checks whether a link points to the site's own URL, so that such links are not reported to SLV.
Try this for line 159 in SLVbase.class.php:
Text Formatted Code
if (!empty ($_CONF['site_url']) && strpos ($url, $_CONF['site_url']) === 0) {
You should also add your site's domain name to the SLV whitelist then.
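For anyone curious why the guard is needed: on PHP versions before 8.0, calling strpos() with an empty needle raises exactly the kind of warning shown above (older releases word it "Empty delimiter") and returns false, so an empty $_CONF['site_url'] trips it on every link. The !empty() check short-circuits before strpos() ever runs. A standalone sketch of the guarded check (the isLocalLink() function name is mine, not Geeklog's):

```php
<?php
// Sketch of the guarded check from line 159; isLocalLink() is a made-up
// wrapper name, not part of SLVbase.class.php.
function isLocalLink($url, $siteUrl)
{
    // Short-circuit when no site URL is configured, so strpos()
    // never receives an empty needle and never warns.
    return !empty($siteUrl) && strpos($url, $siteUrl) === 0;
}

var_dump(isLocalLink('http://example.com/page', ''));                   // empty site_url: treated as external
var_dump(isLocalLink('http://example.com/page', 'http://example.com')); // own site: skipped
```

With an empty site URL every link is treated as external and submitted to SLV, which is another reason to whitelist your own domain as suggested.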
bye, Dirk
Status: offline
AA6QN
Forum User
Junior
Registered: 12/30/06
Posts: 16
You're fast!!
Yes, I did take a look at:
/**
* Extract links
*
* Extracts all the links from a post; expects HTML links, i.e. <a> tags
*
* @param string $comment The post to check
* @return array All the URLs in the post
*
*/
function getLinks ($comment)
{
global $_CONF;
$links = array();
preg_match_all( "/<a[^>]*href=[\"']([^\"']*)[\"'][^>]*>(.*?)<\/a>/i",
$comment, $matches );
for ($i = 0; $i < count ($matches[0]); $i++) {
$url = $matches[1][$i];
if (strpos ($url, $_CONF['site_url']) === 0) {
// skip links to our own site
continue;
} else {
$links[] = $url;
}
}
return $links;
}
And I figured something was amiss with the missing site URL. I will give your patch a go, and I did add the site to the whitelist.
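For reference, here is how the patched function behaves end to end. This is a reduced, standalone sketch: the real getLinks() reads the site URL from the global $_CONF array, while this version takes it as a parameter so it can be exercised directly.

```php
<?php
// Standalone sketch of getLinks() with the !empty() guard applied.
// The real method in SLVbase.class.php uses global $_CONF['site_url'];
// here it is passed in as $siteUrl for easy testing.
function getLinks($comment, $siteUrl = '')
{
    $links = array();
    // Pull the href target of every <a> tag out of the post
    preg_match_all("/<a[^>]*href=[\"']([^\"']*)[\"'][^>]*>(.*?)<\/a>/i",
        $comment, $matches);
    foreach ($matches[1] as $url) {
        // Skip links to our own site, but only when a site URL is
        // configured, so strpos() never sees an empty needle
        if (!empty($siteUrl) && strpos($url, $siteUrl) === 0) {
            continue;
        }
        $links[] = $url;
    }
    return $links;
}

$post = 'See <a href="http://example.com/a">this</a> and <a href="http://other.net/b">that</a>.';
print_r(getLinks($post, 'http://example.com')); // only the external link remains
print_r(getLinks($post, ''));                   // empty site_url: both links reported, no warning
```

With an empty site URL the function now simply reports every link instead of erroring out, which is the behavior the patch restores.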
Blessings, JohnF