mlepisto
Forum Replies Created
Forum: Fixing WordPress
In reply to: Googlebot cannot access CSS and JS files
omurphy, you’re welcome to do as you wish.
The response you quoted is specific to the wp-admin pages.
Theme files – which aren’t in the wp-admin directory – may include items such as TimThumb or other automatic image-resizing scripts, and these can cause a significant server performance hit when crawled, especially if many of them are crawled in quick succession, as I mentioned before.
Forum: Fixing WordPress
In reply to: Googlebot cannot access CSS and JS files
mlepisto what about if the .js file is inside of wp-admin and you are disallowing that…
This will still work if the Allow lines come after the Disallow lines – i.e., at the very bottom of the robots.txt file. The Allow overrides the Disallow for any URL that matches the pattern /*.js* in your case.
There could be some unintended consequences in rare cases.
If you have a file at /old/something.js, for example, it would be allowed even though /old/ may be disallowed. My example is a “fits 99% of DIY webmasters” type of example, so keep that in mind.
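To make the matching concrete, here is a small, unofficial Python sketch of how a wildcard pattern like /*.js* is compared against URL paths. The robots_pattern_matches helper is hypothetical – it is not part of any robots.txt library – and it only models the common wildcard convention where '*' matches any run of characters and '$' anchors the end of the URL:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    # Translate the robots.txt wildcard pattern into a regex:
    # '*' matches any sequence of characters; a trailing '$'
    # anchors the match to the end of the path.
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    # re.match already anchors at the start of the path.
    return re.match(regex, path) is not None

# "Allow: /*.js*" matches any path containing ".js" --
# including files inside otherwise-disallowed directories.
print(robots_pattern_matches("/*.js*", "/wp-includes/js/jquery/jquery.js"))  # True
print(robots_pattern_matches("/*.js*", "/old/something.js"))                 # True
print(robots_pattern_matches("/*.js*", "/wp-admin/"))                        # False
```

This is why /old/something.js slips through: the pattern cares only about “.js” appearing somewhere in the path, not about which directory the file lives in.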
Forum: Fixing WordPress
In reply to: Googlebot cannot access CSS and JS files
Leave your robots.txt blank, unless there is a specific directory you need to block such as an affiliate directory or another. There is no need to block WP directories such as wp-admin or wp-includes any more. This is what is causing the issue.
I disagree with this. You definitely want to block directories you don’t want crawled regularly or, more importantly, indexed. While robots.txt isn’t necessarily respected by all crawlers, it can reduce duplicate content (say, image thumbnails picked up by Googlebot-Image), and if those images are generated by code on the fly (TimThumb, for example), crawling them will increase server load.
So, at a very minimum, a properly managed robots.txt file will help you reduce server load and make sure duplicate or irrelevant content isn’t indexed.
Forum: Fixing WordPress
In reply to: Googlebot cannot access CSS and JS files
If you add the following lines to the very bottom of almost any robots.txt file, it will fix these issues:
User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
I have step-by-step instructions posted here, as they are rather lengthy, on how to notify Google Webmaster Tools that you’ve updated your robots.txt file and how to verify that the changes work.
You will need access to your actual robots.txt file to do this; it is not WordPress-specific and will work on almost any site.
However, if you have a plugin that manages robots.txt for you, you may need to add this code through the plugin’s interface, or contact the plugin developer for additional support.
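As an illustration (the Disallow lines below are just common WordPress-era examples, not a recommendation for your site), a robots.txt with the fix appended at the bottom might look like this:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

User-Agent: Googlebot
Allow: /*.js*
Allow: /*.css*
```

This is a sketch only; exactly how Allow and Disallow rules interact can vary between crawlers, so always verify the result with the robots.txt testing tool in Google Webmaster Tools.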
Forum: Plugins
In reply to: [Plugin: SEO Smart Links] Linking a word in an image path and showing that!
I’m also having the issue with it linking a word that exists in an image filename. The only workarounds I have been able to use so far are to:
1. Disable automatic links on that post/page
2. Change the filename
3. Disable the plugin entirely
Would love a fix for this. It’s kind of sad to see that the premium version includes bug fixes and features while the free version keeps the bugs. I’m all for the freemium model, but bug fixes should be part of everything, while new features should be part of premium.
That did it, thanks.
yeah, I get that.
I have deleted the *directories* and *files*, but the web-accessible supercache configuration page within wp-admin still shows a list of the files for which it thinks it should create cache files. I’m trying to find where it keeps that list so I can manually remove the stale entries from it as well.
Hi,
I don’t have that exact directory, only wp-content/cache/supercache
I did clear that out completely, but when I look at wpmu-admin.php?page=wpsupercache for the site, it still shows ‘existing direct page’ even though the direct page and the cache directories are removed.
Does it store the list of what it thinks is directly cached in the db somewhere?
Thanks!
Forum: Themes and Templates
In reply to: Query pages by partial title match
Hi Wex,
Thanks for the responses.
I did just end up doing a query to get all pages, then putting the data into an array and working with it from there. Not quite as efficient as I would like, but it works better than no solution :)
Forum: Fixing WordPress
In reply to: Diff for a Wp 2.0 compatible plugin – Feedwordpress
paulproteus:
First, thanks for posting that file!
I’m wondering, though: what is the optional folder for the rss-functions.php file? Do you know what the changes to that file are?
Thanks,
Mika
Forum: Fixing WordPress
In reply to: Multiuser Image Posts?
I am struggling with creating a multiuser blog (i.e., blog hosting, not just multiple authors) based on WP2.
I just updated to 2.0.1 but the image functions still need some work. I am seriously considering going back to WP1.5.
Here’s the problems I face:
1. No way to limit the image upload file size through the interface.
2. Image posting in the WYSIWYG is horrible. It acts nice and looks nice, but the sizing/resizing doesn’t work because:
a) using the sizing function makes the image disproportionate
b) if the image is larger than the WYSIWYG screen, it’s a mess.
I know how to do this without the WYSIWYG, but I’m trying to create a solution for people who don’t. I’m thinking of writing a custom image upload function (not a plugin) that allows me to specify the size, auto-generate a thumbnail, and set maximum pixel dimensions on the “large” file. I really don’t want to, because I’m not great at PHP, so if someone else wants to, I will be super delighted…
In summary, I would stick with 1.5 if you are already using it, because there are more plugins out there that work, and let 2.0 have the bugs worked out of it.
Now, this is in NO WAY a slam on the people who wrote this fine WordPress system. WP2 is a whole lot better in just about every way, including the image system, if it would only allow a bit more management-level control. Keep up the good work & thanks for WP!