ABTUK webmaster
Forum Replies Created
Forum: Plugins
In reply to: [Contact Form 7] Save images to Media Library
I have the same requirement and would be interested to learn if it is possible (and, if so, how).
Thanks.
cf7msm_get( 'cf7msm_posted_data' )
returns an array. I achieved what I wanted by replacing:
$_POST['answer_11']
with:
cf7msm_get( 'cf7msm_posted_data' )['answer_11']
Any thoughts?
Any thoughts on how to prevent these backslashes being added at every use of the Next button?
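In case it's useful to anyone else, the replacement plus a possible workaround for the slashes might look roughly like this (a sketch only: the isset() guard is my own addition, and stripslashes_deep() is a standard WordPress function that removes one level of backslashes, so it treats the symptom rather than whatever is adding the slashes):
$posted = cf7msm_get( 'cf7msm_posted_data' );                              // values collected across all steps
$posted = is_array( $posted ) ? stripslashes_deep( $posted ) : array();   // strip one level of backslashes
$answer = isset( $posted['answer_11'] ) ? $posted['answer_11'] : '';      // instead of $_POST['answer_11']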
Yes I was talking about the view in the admin page.
Your option 1 supports searches only through the subset of the table that is currently available in the browser. Since I very much like the search facility and my database currently has under 100 rows, I don’t want to set “Maximum number of rows to retrieve from the DB for the Admin display” (but I accept it may be needed in future if the database size increases significantly).
Your option 2 (the “Show # entries” control) gives me pretty well everything I need, and within CF7DB Options I can set its default value via “#Rows (of maximum above) visible in the Admin datatable”.
Why didn’t I explore those before now?! Thank you for your prompt and precise reply. Marking the topic as resolved.
In my previous post (immediately above) I wrote “It is indeed CloudFlare.”
While it’s true that CloudFlare wasn’t caching the responses to GET requests (and so couldn’t answer later HEAD requests without issuing another GET to the origin server), this happened because I was unaware that the .htaccess at my site included Header set Cache-Control “…, private, …” – which basically tells proxies “do not cache”. After I located that and changed it to Header set Cache-Control “…, public, …”, everything worked as I hoped it would.
So it was a simple user error, aka my fault, all along. If my .htaccess settings had not prevented CloudFlare from doing its job, the “problem” described in this topic just wouldn’t have happened.
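For anyone hitting the same thing, the change amounted to something like this in .htaccess (the values here are illustrative only, not my exact directive, and mod_headers must be enabled):
# Before: "private" tells CloudFlare (and any other proxy) not to cache the response
# Header set Cache-Control "max-age=86400, private"
# After: "public" allows proxies such as CloudFlare to cache it
Header set Cache-Control "max-age=86400, public"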
In case anyone’s interested, here’s the response I got from CloudFlare that enabled me to find the error:
– Asset is a cacheable file extension, and is in our cache – GET & HEAD requests are returned without hitting the origin
– Asset is not a cacheable file extension – GET & HEAD requests are proxied to the origin
– Asset is a cacheable file extension, and it is NOT in our cache – GET or HEAD request is proxied to the origin and converted to a GET, so that we can serve subsequent GET/HEAD requests from our cache.
If you think the behaviour you’re seeing is different, it may be that your origin is serving headers that are instructing us not to cache. As an example, typically servers send no-cache headers with most non-200 response codes, e.g. a 404 will be served with no-cache. In that case we’ll respect that.
Regards, Pete
Hello Janis,
Thanks for your in-depth analysis and conclusions that you emailed to me. It is indeed CloudFlare.
If I set Development Mode (=no caching) in CloudFlare, my Apache log shows that BLC issues only HEAD requests for all files regardless of filetype, and the bandwidth attributed to BLC is now almost zero.
I will raise this with CloudFlare and let you know the result. In the meantime, I have unset CloudFlare Development Mode and created CloudFlare Page rules to bypass cache for 2 URLs:
– *.[domain-name]/directory1/directory2/* (everything in that directory)
– *.[domain-name]/*.pdf (all PDFs anywhere on the website)
It’s a compromise between the benefits of CloudFlare caching for normal visitors and the requirement to minimise the bandwidth used by Broken Link Checker. Other websites might want to define things differently.
Let’s hope that CloudFlare come up with a decent permanent solution.
Thanks again,
Pete (ABTUK webmaster)
More info that might give you some clues:
If I temporarily modify my .htaccess RewriteRules to reject all HEAD requests with HTTP 404 (Not Found), then for .zip and .rtf filetypes (the only ones for which HEAD is issued), the Apache logs show:
– a HEAD request being rejected with HTTP 404, followed by
– a GET request with HTTP 200 (success) and length 2049
So your logic in http.php appears to be getting control, but not for all filetypes. Also, it seems my server honours Range request headers.
In the next few minutes I will email you further information (Apache logs, screenshots, etc.) plus some links you can test with.
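For completeness, the temporary rejection was done with rules along these lines (a reconstruction, so the exact pattern I used may have differed):
# temporary test: answer every HEAD request with 404 Not Found
RewriteEngine On
RewriteCond %{REQUEST_METHOD} ^HEAD$
RewriteRule ^ - [R=404]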
Regards,
Pete
Thanks for the speedy reply. I issued a “Force Recheck” a few minutes ago and the only situations in which BLC issued HEAD were for:
WP permalinks to all pages of my site
.zip files
.rtf files
All other accesses were GET requests without any prior unsuccessful HEAD. The links involved are not WP permalinks but explicit files (with names ending .jpg, .xls, .pdf, etc.).
I’ve defined BLC to check every 72 hours, and the Apache access log shows that this is indeed what happens. I realise I could reduce this frequency.
None of my links are defined as HTTPS, and the Apache access log reports all BLC accesses as having Protocol HTTP/1.1
The Apache log says that the full file size was downloaded for all GET requests, so I guess that my server doesn’t support Range headers?
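One way to check, I suppose, would be to request part of a file with a Range header and see whether the response comes back as 206 Partial Content rather than 200. A rough sketch using standard WordPress HTTP functions (the URL is just a placeholder):
$response = wp_remote_get( 'https://example.com/files/sample.pdf', array(
    'headers' => array( 'Range' => 'bytes=0-2047' ),
) );
echo wp_remote_retrieve_response_code( $response );   // 206 = Range honoured, 200 = full file returned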
Any ideas? I can supply the Apache log and any other info you need.
Pete
I’m pleased that you understand my need, and that you agree it could be a worthwhile future enhancement. It would certainly make my “real” form (whose page 2 contains many more fields than the “test” form you have seen) much more user-friendly after a Back/Next sequence. I look forward to this becoming available in future.
Since the problem I originally reported in this topic has now been addressed by correcting my own “Cache-Control: max-age=…” definition, I’ll mark the topic as resolved. Thanks, and have a great Christmas/New Year break.
Regards,
Pete Simpson
Thanks for your speedy response. After adjusting page 2 of my form to resemble your page 2 (i.e. to provide a textarea instead of a date) and retesting, I agree that my form works the same way yours does, and that neither form achieves what I would like to happen.
What happens (on both your form and mine):
– complete page 1 fields, and press Next
– complete page 2 fields, and press Back (to redisplay page 1)
– press Next. All field values previously entered on page 2 are gone.
It would be nice if the previously-entered page 2 values could be retained over a Back/Next sequence. Could the plugin commit these values before performing its “Back” processing (in the same way as I presume it does before performing its “Next” processing)?
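Purely to illustrate the idea (this is not the plugin’s code, and I’m only guessing at the mechanism): before switching back, the values just posted from the current page could be merged into the saved data, along the lines of
$saved  = cf7msm_get( 'cf7msm_posted_data' );
$merged = array_merge( is_array( $saved ) ? $saved : array(), $_POST );   // keep the page 2 values
// ...and $merged would then be stored wherever the plugin keeps 'cf7msm_posted_data'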
Regards,
Pete
Found it. My fault.
The response headers for all HTM[L] and PHP files sent to browsers from my website included:
Cache-Control: max-age=31, private, proxy-revalidate
This caused pages to be redisplayed from the browser’s cache if the cached version was less than 31 seconds old. This value dates from a while ago when my website served little or no dynamic content. But it is obviously not appropriate for (relatively) rapid use of the CF7-MSM “Next” and “Back” buttons.
After removing this response header I can no longer recreate the problem at my website. Apologies for wasting your time.
… but I would still like to know how, on your own website, you repopulate the page 2 fields after Back + Next. When I provide field values on page 2 and then click Back + Next, all field values are lost.
Regards,
Pete
Hello Webheadllc,
I notice that you tested a few times today on my website. Great! Were you able to reproduce the problem there, and see for yourself what I tried to describe in my earlier posts to this topic?
I also cannot reproduce the problem on your test setup at webheadcoder.com/contact-form-7-multi-step-form.
I notice one difference between the way your site works and mine. If I supply a field value on the 2nd page and then click Back + Next, the 2nd page field value is gone. On your site the 2nd page field value remains. Could this be relevant?
Thank you for your suggestion. I commented out what I hope are lines 81-83 (it’s difficult to tell because line numbers are not displayed in WP’s plugin editor):
// if ( !empty( $cf7msm_posted_data ) && is_array( $cf7msm_posted_data ) ) {
//     $value = isset( $cf7msm_posted_data[$name] ) ? $cf7msm_posted_data[$name] : '';
// }
and inserted your suggested logic immediately thereafter.
The problem still appears to exist.
However, the following may provide you with some clues: using Chrome Incognito plus developer tools | Network, if I tick the “Disable cache” box I can no longer reproduce the problem, and if I untick the box the problem returns. I cycled through this several times, making several attempts at each setting before switching between ticked and unticked. The results appear to be consistent: caching enabled = problems, caching disabled = no problems.
Any ideas?
Plugin v1.4.3 fixes the similar symptoms I had. I’ll leave it to wynot (the original poster) to retest with v1.4.3 and (hopefully) flag this topic as resolved.
Plugin v1.4.3 fixes the similar symptoms I had. I’ll leave it to Marry123 (the original poster) to retest with v1.4.3 and (hopefully) flag this topic as resolved.