diff --git a/htroot/IndexCreate_p.html b/htroot/IndexCreate_p.html
index f6e45061a..5d5b348d8 100644
--- a/htroot/IndexCreate_p.html
+++ b/htroot/IndexCreate_p.html
@@ -73,7 +73,7 @@ You can define URLs as start points for Web page crawling and start crawling her
This message will appear in the 'Other Peer Crawl Start' table of other peers.
- If checked, the crawler will contact other peers and use them as remote indexers for your crawl. .
+ If checked, the crawler will contact other peers and use them as remote indexers for your crawl.
If you need your crawling results locally, you should switch this off.
Only senior and principal peers can initiate or receive remote crawls.
A YaCyNews message will be created to inform all peers about a global crawl, so they can omit starting a crawl with the same start point.
@@ -112,18 +112,18 @@ You can define URLs as start points for Web page crawling and start crawling her
From File:
-
+
From URL:
-
+
Existing start URLs are re-crawled.
- Other already visited URLs are sorted out as 'double'.
+ Other already visited URLs are sorted out as "double".
A complete re-crawl will be available soon.
-
+
@@ -228,7 +228,7 @@ Continue crawling.
Start URL
Depth
Filter
-Accept '?' URLs
+Accept "?" URLs
Fill Proxy Cache
Local Indexing
Remote Indexing
@@ -315,4 +315,4 @@ No remote crawl peers availible.
#%env/templates/footer.template%#