admin+database tools/

The following web-based utilities provide raw access to your Wiki's database and allow for common administration tasks. They are separate from the main wiki and, for obvious reasons, inaccessible without a password. You can, however, use the account "readonly" with password "test" on your local computer or on our demo site.


page flags

Every page can have multiple control flags and feature settings enabled, like _DISABLED, _READONLY, _APPENDONLY, and so on. Note that this tool does not show all of them.
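Such per-page flags are typically stored as a single integer bitfield. The following Python sketch illustrates the idea; the flag names match the ones above, but the bit values and the helper are assumptions for illustration, not ewiki's actual constants or code (ewiki itself is written in PHP).

```python
# Illustrative bit values -- NOT ewiki's real constants.
F_DISABLED   = 0x01
F_READONLY   = 0x02
F_APPENDONLY = 0x04

def flag_names(flags):
    """Return the symbolic names of all flags set in a bitfield."""
    table = {F_DISABLED: "_DISABLED", F_READONLY: "_READONLY",
             F_APPENDONLY: "_APPENDONLY"}
    return [name for bit, name in table.items() if flags & bit]

# Example: a page that is both disabled and read-only.
print(flag_names(F_DISABLED | F_READONLY))  # ['_DISABLED', '_READONLY']
```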


remove pages

You can completely delete unused or unwanted pages from the database. This tool even analyzes all entries and gives you recommendations on which pages to purge.


strip old versions

Every page edit adds an "archive version", which makes it possible to revert unwanted changes. You should, however, clean out old versions from time to time to free database space.
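The pruning logic amounts to "keep the newest N versions, delete the rest". A minimal sketch of that selection step, in illustrative Python (the `keep` cutoff and function name are assumptions, not part of ewiki):

```python
def strip_old_versions(versions, keep=3):
    """Given a page's version numbers, return those that can be
    deleted, keeping only the `keep` most recent ones."""
    if len(versions) <= keep:
        return []
    return sorted(versions)[:-keep]

# A page with versions 1..5: versions 1 and 2 can be stripped.
print(strip_old_versions([1, 2, 3, 4, 5]))  # [1, 2]
```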


check link directory

Checks external hyperlinks, useful if you have pages that collect them (link directories).
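Conceptually, such a checker extracts the URLs from a page's text and probes each one over HTTP. A rough Python sketch of both steps, under the assumption that links appear as plain `http://` or `https://` URLs in the page text (the regex and helper names are illustrative, not ewiki's):

```python
import re
from urllib.request import Request, urlopen

URL_RE = re.compile(r"https?://[^\s\"'<>\]]+")

def extract_links(page_text):
    """Collect all http(s) URLs found in a page's raw text."""
    return URL_RE.findall(page_text)

def check_link(url, timeout=5):
    """Return True if the URL answers with a 2xx/3xx status."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as r:
            return 200 <= r.status < 400
    except Exception:
        return False
```

A link-directory page would then be scanned with `extract_links()` and each result verified with `check_link()`.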


This tool from the plugins/admin/ section allows you to delete and rename a page, or to set the flags or metadata fields in it:


text insert

Insert a collection of plain text files as wiki pages into the database (the files are not versioned). The files must already reside on the Web server (like the files in the init-pages/ dir).
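The gathering step can be sketched in Python for illustration (ewiki itself is PHP): read every plain file in a directory, using the file name as the page name. How a page row is actually written into the wiki database is not modeled here.

```python
import os

def collect_pages(directory):
    """Map file names (used as page names) to their plain-text
    contents, mirroring a layout like the init-pages/ dir."""
    pages = {}
    for name in sorted(os.listdir(directory)):
        path = os.path.join(directory, name)
        if os.path.isfile(path):
            with open(path, encoding="utf-8") as f:
                pages[name] = f.read()
    return pages
```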



If you don't have this powerful plugin loaded, you can at least access it from here and upload pages in many different file formats, even multiple pages packed into .zip or .tar files.


Archives all your pages in backup files (available in different formats) for archival / later emergency restoration. You need a writable data directory on your Web server.



Restore your database from archived backup files, which you must first upload into a directory on the Web server.


transfer file

Use this tool to quickly get a backup of all Wiki pages or to restore it later. You directly upload or download a data file in binary format, which eases moving pages from one installation to another. With the MiniDump plugin you could instead get a tarball.


database convert

This tool can migrate pages from a few known foreign Wiki databases to the DB schema used by ewiki. It can now also export back to PhpWiki.


mass revert

Use this tool to undo mass changes made by an attacker or an automated wiki-spamming script. The offending revisions are found by the IP address or host name under which the edits were stored.
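The core of such a revert is: walk a page's history backwards and pick the newest revision that did not come from the offending host. An illustrative Python sketch (the tuple layout and function name are assumptions, not ewiki's internals):

```python
def mass_revert(history, bad_host):
    """history: list of (version, host) tuples, oldest first.
    Return the newest version NOT saved from bad_host, or None
    if every revision came from that host."""
    for version, host in reversed(history):
        if host != bad_host:
            return version
    return None

# Versions 2 and 3 came from the attacker; revert to version 1.
history = [(1, "10.0.0.1"), (2, "evil.example"), (3, "evil.example")]
print(mass_revert(history, "evil.example"))  # 1
```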

search and replace

Provides a global search feature, which can even use regular expressions to find string occurrences and replace them with defined text fragments. It can also perform a safe dry run first.
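A dry run means the tool reports what would change without writing anything back. A minimal Python sketch of that behavior, assuming pages are held as a name-to-text mapping (not how ewiki actually stores them):

```python
import re

def search_replace(pages, pattern, replacement, dry_run=True):
    """pages: {name: text}. Returns {name: substitution_count} for
    pages that would change; writes changes back only when
    dry_run is False."""
    rx = re.compile(pattern)
    report = {}
    for name, text in pages.items():
        new_text, n = rx.subn(replacement, text)
        if n:
            report[name] = n
            if not dry_run:
                pages[name] = new_text
    return report
```

With `dry_run=True` (the default) you get the per-page hit counts while the texts stay untouched; running again with `dry_run=False` applies the replacement.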



Transfer the contents of this Wiki's database onto a remote installation, or the other way round. Useful for editing pages offline while keeping a public Wiki installation up to date.

power toys

Some of the older and simpler tools on the left have been superseded by more powerful, combined ones (which, on the other hand, are not necessarily easier to use for everyone).


The WikiCommander provides an easy, file-viewer-like interface to inspect and modify pages and all their internal flags and fields. You can also delete, rename or duplicate pages. It even provides a shell-like interface for entering common actions on a "commandline".
So it effectively replaces a few of the early utilities.


ewikictl is a powerful command-line utility, but you need shell (ssh) access on the Web server to use it. For all other users there is now this web-based frontend, which gives access to most of its commands.

setup wizard

With the SetupWizard you can more easily create a custom config.php script or even a monsterwiki.php file (a combination of the ewiki core and plugin files). This saves you from digging through README.plugins, provided you can live without a few advanced features.

run cron.d/

If you don't have a cron daemon, you can start the automated database admin tools by hand on occasion. They do, however, need a bit of configuration before use (very little of it is enabled by default).


If you have the right plugins loaded, you also get in-Wiki power tools like:
- WikiDump and the enhanced WikiDump2, which make a tarball of static .html files from all your pages
- TextUpload for uploading many pages in a ZIP or tarball
- MiniDump to get a tarball with the raw text of all pages (for later reinsertion)
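The idea behind a static dump like WikiDump can be sketched as follows, in illustrative Python: render each page to an .html file and pack them all into a tarball. The real plugin renders full wiki markup; this sketch just wraps the raw text in a `<pre>` block.

```python
import io
import tarfile
import time

def dump_static_html(pages):
    """pages: {name: wiki_text}. Return a gzip'd tarball (as bytes)
    with one trivially rendered .html file per page."""
    buf = io.BytesIO()
    with tarfile.open(fileobj=buf, mode="w:gz") as tar:
        for name, text in pages.items():
            html = "<html><body><pre>%s</pre></body></html>" % text
            data = html.encode("utf-8")
            info = tarfile.TarInfo(name=name + ".html")
            info.size = len(data)
            info.mtime = int(time.time())
            tar.addfile(info, io.BytesIO(data))
    return buf.getvalue()
```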