A simple site search engine with PHP and SQLite
I recently quoted on putting a search engine in place on a customer's website. I had suggested using the Google Custom Search facility, thinking it would save me plenty of time and do just what he wanted. After some deliberation between myself and the client it was decided that the Google search just wouldn't be suitable, as it also lists competitors' websites, among a number of other reasons. In the end, because of the simplicity of the site and what I needed, I decided to create a simple one myself.

The website was built in a very simple way: each page is a PHP page that includes a header.php and footer.php file, with the main content displayed between these two includes. This is quite a common setup for a simple PHP site and is usually the basic structure taught to beginners of PHP.

To get this up and running I set up a config file and a script I could run to add all the pages to a database. Here is the config file; it creates a couple of constants for our database file and our database table name, creates a connection, and defines a function that can be run once to set up the initial table.
define("DBNAME",'sitesearch.db');
define("TBL_PAGES", 'pages');
$conn = new PDO('sqlite:'.DBNAME);
#createTables();
function createTables()
{
global $conn;
$query = "CREATE TABLE ".TBL_PAGES."(
page_id INTEGER PRIMARY KEY,
page_file VARCHAR(255),
page_name VARCHAR(255),
page_description VARCHAR(255),
page_content longtext,
page_keywords text
)";
$conn->exec($query) or print_r($conn->errorInfo());
}
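To create the table the first time, you can uncomment the #createTables(); line and load the config once, or use a throwaway setup script. Here is a minimal sketch of such a script (setup.php is a hypothetical name, not part of the original site):

<?php
// setup.php - hypothetical one-off script to create the pages table.
// Load it once in the browser or run it from the CLI, then delete it.
include_once("config.php");
createTables();
echo "Created table ".TBL_PAGES." in ".DBNAME;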
Here is the script to add all the pages to the database. It works by reading all the files in the current directory and checking that they are PHP files rather than directories or other undesirable files. Once it has a file, it reads it into an output buffer, takes the variables $page_title, $meta_description and $meta_keywords from inside the file, and stores them in the database along with the page contents.
include_once("config.php");
if ($handle = opendir('.')) {
while (false !== ($file = readdir($handle))) {
if ($file != "." && $file != ".." && $file != ".htaccess" && $file != "showdb.php" && $file != "search.php" && $file != "config.php" && $file != "header.php" && $file != "footer.php" && $file != "index_pages.php") {
if(get_file_extension($file) == "php")
{
insert_page($file);
}
}
}
closedir($handle);
}
function get_file_extension($file_name)
{
return substr(strrchr($file_name,'.'),1);
}
function insert_page($page_file)
{
global $conn;
$pg_name = "";
$pg_description = "";
$pg_contents = "";
$pg_keywords = "";
ob_start();
require_once($page_file);
$pg_name = $page_title;
$pg_description = $meta_description;
$pg_keywords = $meta_keywords;
$pg_contents = ob_get_contents();
ob_end_clean();
$query = "INSERT INTO ".TBL_PAGES."(page_file, page_name, page_description, page_content, page_keywords) VALUES (".$conn->quote($page_file).", ".$conn->quote($pg_name).", ".$conn->quote($pg_description).", ".$conn->quote($pg_contents).", ".$conn->quote($pg_keywords).")";
$conn->exec($query) or print_r($conn->errorInfo());
}
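For reference, the indexer expects each page to set the $page_title, $meta_description and $meta_keywords variables before its header include, following the header/footer pattern described earlier. A hypothetical page (about.php, purely for illustration) might look like this:

<?php
// about.php - hypothetical example of a page the indexer can pick up.
// The three variables below are what insert_page() reads into the database.
$page_title       = "About Us";
$meta_description = "A short description of the company.";
$meta_keywords    = "about, company, history";
include("header.php");
?>
<h1>About Us</h1>
<p>The main page content goes here and ends up in the page_content column.</p>
<?php include("footer.php"); ?>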
Once the scripts are set up and run, we should have a database with all the pages in it. Now we need to create the search page to display this data. To do this I created a simple search form and then made use of the PEAR Pager class to spread the results over multiple pages. The form simply takes the search value and runs a LIKE query against the SQLite database to return the matching results.
<h1>Search</h1>
<form action="" method="get">
    <input type="text" name="search" value=""/>
    <input type="submit" name="submit" value="Search"/>
</form>
<div id="results">
<?php
// Assumes config.php (which sets up $conn and TBL_PAGES) has already been included
require_once 'Pager/Pager.php';

$search = isset($_REQUEST['search']) ? $_REQUEST['search'] : "";

// cleanup() tidies the search term before it is used in the query
$search = cleanup($search);

if (!empty($search)) {
    /* First we need to get the total number of matching rows */
    $term = $conn->quote('%'.$search.'%');
    $qry = "SELECT * FROM ".TBL_PAGES." WHERE page_file LIKE ".$term." OR page_name LIKE ".$term;
    $result = $conn->query($qry);
    if ($result === false) {
        print_r($conn->errorInfo());
    }
    $totrows = count($result->fetchAll());

    $pager_options = array(
        'mode'       => 'Sliding',
        'perPage'    => 10,
        'delta'      => 4,
        'totalItems' => $totrows,
    );
    $pager = Pager::factory($pager_options);

    list($from, $to) = $pager->getOffsetByPageId();
    $from = $from - 1; // Pager offsets are 1-based, LIMIT offsets are 0-based
    $perPage = $pager_options['perPage'];

    // Fetch just the rows for the current page
    $query = $conn->query("SELECT * FROM ".TBL_PAGES." WHERE page_file LIKE ".$term." OR page_name LIKE ".$term." LIMIT ".$from." , ".$perPage);

    echo '<ul id="searchresults" class="clearfix">';
    if ($totrows > 0) {
        foreach ($query as $entry) {
            echo "<li>";
            // lnk() builds the site URL from the filename (helper defined elsewhere)
            echo '<a href="'.lnk('/'.substr($entry['page_file'], 0, -4).'/').'">'.$entry['page_name'].'</a>';
            echo "<p>".$entry['page_keywords']."</p>";
            echo "</li>";
        }
    } else {
        echo "<li>There were no search results found. Please try another keyword or phrase.</li>";
    }
    echo '</ul>';

    if (isset($pager)) {
        echo '<div class="pager">';
        echo $pager->links;
        echo '</div>';
    }
}
?>
</div>
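The cleanup() call on the search page refers to a small helper that isn't shown above; it just tidies up the user's input before it reaches the query. A minimal sketch of what such a helper might look like (my assumption, not the original implementation):

<?php
// Hypothetical sketch of the cleanup() helper used on the search page.
// It simply strips any markup and surrounding whitespace from the input;
// the actual function on the site may do more than this.
function cleanup($search)
{
    return trim(strip_tags($search));
}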
It's a pretty simple site search that was whipped up quickly to do a job and stay on target for the timescale. I've found it particularly useful when I don't have access to a database server on a website: if it's a simple HTML/PHP site, I can throw this together and set up a nice little search engine in minutes. I'd love to see any feedback or suggestions for better alternatives.