1

(17 replies, posted in Help for TF users)

Thanks. It's good to see some conversation on this.

Personally, my project is already on Savannah; I just need to find suitable hosting for the website. Codeberg does seem to have a "pages" feature, which could work. Otherwise I will have to look into setting up my own server.

2

(17 replies, posted in Help for TF users)

I just found out about this. It is really sad to hear, but I understand that, as time has passed, it has become challenging to keep everything running and to stay motivated. I am truly, deeply thankful for all the years TuxFamily has provided hosting for my project's website; it is something I will always remember. Thanks for all the hard work. I want you to know that you have brought much happiness to your users.

Now I need to check whether the "downloads" files can still be downloaded over SSH. I have backups, of course, but I want to be sure I'm not missing any files.

And I would love to see some discussion of alternatives for site hosting. Right now my project is primarily on Savannah (nongnu), which has most of the features and the right philosophy, while TuxFamily provides the website with a custom domain name. I don't think Savannah supports custom domain names. And even though GitHub and GitLab provide hosting for static pages, I want to stick to more freedom-respecting services. If nothing else, I guess I could set up a server at home.

3

(1 reply, posted in Thanks)

... I really should have written this long ago:

Thank You to everyone on TuxFamily!

Some background: the project I am working on has moved around a lot,
from SourceForge to GitHub and finally Savannah. What drove me to
Savannah was that (unlike the others) it felt like it genuinely valued
software freedom. The only problem with Savannah has been the lack of a
forum, something that has always been important for the project: it was
originally created from forum discussions and got its main driving
force and motivation from the community. The project had a website and
forum hosted on SourceForge by another person. He eventually disappeared
and it went down. After a while the project got a new forum (and a real
domain name) co-hosted on a server by a friend (who also originally
introduced me to Git, and ported the code to C++), but that server only
lasted for about a year.

Then I finally discovered TuxFamily, and realized this was the hosting I
had been searching for the whole time... :)

In short: the project is still partially hosted on Savannah (most
notably the Git repo), but for more than a year now the site and forum
have been hosted on TuxFamily, and it has been the best experience in
all the years the project has existed. There may be more changes in the
future (possibly even a name change for the project), but I know that as
long as TuxFamily exists I will always have a place to call "home."

:)

4

(14 replies, posted in Suggestions)

Sorry for writing in English; my French is close to non-existent, but I really wanted to reply to this.

TuxFamily was the first thing I thought of when I heard of "Let's Encrypt" (on SoylentNews). Not because of the goal to make SSL configuration easier, but because it might provide an alternative to the self-signed certificate, if it gains traction (seeing Mozilla, the EFF, and IdenTrust on the list of sponsors makes me hopeful).

Don't get me wrong, the current self-signed solution works great for the important stuff (SSH & Git obviously, but also web interfaces for CMSes), but having HTTPS with a certificate already accepted by default in most browsers (without warnings and without having to add exceptions) would give a very good impression to the more casual visitors to project sites and forums.

Thanks for your reply!

First of all: very interesting to hear about the path (automatic and based on hashing). I did not know that; I just assumed it was meant to provide certain ad hoc possibilities (migration/backup/RAID).

I also did not know about the possibility of accessing environment variables through .htaccess. I've changed my rules (just in case the path ever does change) and it seems to work perfectly! I hope checking an environment variable instead of a built-in doesn't cause any big performance impact on Apache?


With the tweak applied, the following is now my current .htaccess:

AddDefaultCharset UTF-8
Options -Indexes

RewriteEngine On
RewriteBase /


#funky cache:
# Check for cached index page from static cache folder.
RewriteCond %{REQUEST_METHOD} ^GET$
RewriteCond %{ENV:DOCUMENT_ROOT_HASH}/cache/index.html -s
RewriteRule ^$ cache/index.html [L]

# Check for other cached pages from static cache folder.
RewriteCond %{REQUEST_METHOD} ^GET$
RewriteCond %{ENV:DOCUMENT_ROOT_HASH}/cache%{REQUEST_URI} -s
RewriteRule (.*) cache%{REQUEST_URI} [L]
#


RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
# Main URL rewriting.
RewriteRule ^(.*)$ index.php?WOLFPAGE=$1 [L,QSA]
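For reference, here is a hedged shell sketch of what the cache rules above effectively do (the `resolve` function and its variable names are made up for illustration; it mimics, rather than replaces, Apache's processing):

```shell
#!/bin/sh
# Sketch of the cache-lookup logic from the .htaccess above, as plain
# shell tests. "docroot" and "uri" stand in for Apache's own variables.
resolve() {
    docroot=$1
    uri=$2
    if [ -s "$docroot/cache$uri" ]; then     # -s: exists and is non-empty,
        echo "cache$uri"                     # the same test RewriteCond uses
    else
        echo "index.php?WOLFPAGE=${uri#/}"   # fall through to the PHP front controller
    fi
}

# Demo against a throwaway document root:
tmp=$(mktemp -d)
mkdir -p "$tmp/cache"
printf '<html>cached</html>' > "$tmp/cache/about.html"
resolve "$tmp" /about.html     # cached copy found, serve the static file
resolve "$tmp" /news.html      # no cache file, rewrite to PHP
rm -rf "$tmp"
```

Note that `-s` (non-empty) rather than `-f` (exists) protects against serving a half-written or empty cache file.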

Thanks again for all your help!

I decided to copy & paste the current value of %{DOCUMENT_ROOT_HASH}, since it unfortunately cannot be used directly from .htaccess. But the path seems to remain unchanged, so hopefully this hard-coded solution will keep working.

The worst-case scenario is that the path changes without me noticing, making the cache check fail. In that case every page (and its cached copy) will be generated by PHP on every visit, resulting in a bit more load. But I will try to check the cache status from time to time, in case anything happens.

My full .htaccess file:

AddDefaultCharset UTF-8
Options -Indexes

RewriteEngine On
RewriteBase /


#funky cache:
# Check for cached index page from static cache folder.
RewriteCond %{REQUEST_METHOD} ^GET$
RewriteCond /data/web/d3/fa/2d/recaged.net/htdocs/cache/index.html -s
RewriteRule ^$ cache/index.html [L]

# Check for other cached pages from static cache folder.
RewriteCond %{REQUEST_METHOD} ^GET$
RewriteCond /data/web/d3/fa/2d/recaged.net/htdocs/cache%{REQUEST_URI} -s
RewriteRule (.*) cache%{REQUEST_URI} [L]
#


RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
# Main URL rewriting.
RewriteRule ^(.*)$ index.php?WOLFPAGE=$1 [L,QSA]
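Since the path is now hard-coded, a small script run over SSH from time to time could catch a silent change; this is just a sketch (the script name is made up, the path is the one copied into the .htaccess above):

```shell
#!/bin/sh
# check_cache_path.sh (hypothetical name): warn if the hard-coded htdocs
# path from the .htaccess above has disappeared, which would make the
# cache rules silently stop matching.
check_path() {
    if [ -d "$1" ]; then
        echo "OK: $1 still exists"
    else
        echo "WARNING: $1 is gone - update the .htaccess"
    fi
}

check_path "/data/web/d3/fa/2d/recaged.net/htdocs"
```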

When I edited my last post it got a bit disorganized, so here's a shorter description of my current problem:

DOCUMENT_ROOT_HASH seems to be exactly what I was looking for, and I can obtain it through phpinfo(). But I seem unable to use it in ".htaccess".

A rule that matched positively in the .htaccess was "RewriteCond %{DOCUMENT_ROOT_HASH} ^$", which should mean it's not set/available?

I guess it's possible that .htaccess only gets a subset of all variables. Would it be okay to copy the string from %{DOCUMENT_ROOT_HASH} directly into the .htaccess, or is there a chance it might change? I know it's probably not a good idea, but I felt the need to ask.

Thanks again for such a wonderful service!


Update: okay, looking at this thread on the forum, it seems the path to the htdocs does not change, and the contents of DOCUMENT_ROOT_HASH can be copied explicitly into the .htaccess file. Is this correct?

Update: problem (might be) solved! My complete .htaccess can be found in the last post. I am still not sure whether the full web path might change in the future, but it works for the moment. The following is my old question:


Hello everybody!

First of all I want to thank everyone at TuxFamily for an amazing service! Apart from the usual project hosting (like Git and the download repository), the website features (like PHP and MySQL) are simply unbelievable!

Now to the problem:

I decided to try to implement caching for my project's website, in order to minimize database and PHP usage. While the website is lightweight and has few visitors, I still want to do as much as I can to minimize any load. When the (PHP-generated) pages are visited, static copies are created in a subdirectory called "cache", and the plan is to check for the existence of these static files using RewriteCond in ".htaccess" and rewrite to them using RewriteRule.

In short: I have been able to successfully check for file existence using:

RewriteCond %{REQUEST_FILENAME} -f

But since I need to check for files in a modified path (the static files are stored in a subdirectory), I need to use the %{DOCUMENT_ROOT} and %{REQUEST_URI} variables (to add "/cache/" in between).

The problem is: when trying to use "RewriteCond" without %{REQUEST_FILENAME}, it always seems like the file is missing. For example, replacing the above with:

RewriteCond %{DOCUMENT_ROOT}%{REQUEST_URI} -f

should (in most cases) produce the same behaviour, but instead fails for me.

I suspect this might be an intentional security feature, but I thought I would ask and see whether I was doing something wrong.

Thanks Again!