Two popular Android apps from Chinese tech giant Baidu were temporarily unavailable on the Google Play Store in October after they were caught collecting sensitive user details.
The two apps in question—Baidu Maps and Baidu Search Box—were found to collect device identifiers, such as the International Mobile Subscriber Identity (IMSI) number or MAC address, without users’ knowledge, thus making them potentially trackable online.
The discovery was made by network security firm Palo Alto Networks, which notified both Baidu and Google of its findings, after which Google pulled the apps on October 28, citing “unspecified violations.”
A compliant version of Baidu Search Box was restored to the Play Store on November 19, while Baidu Maps remains unavailable until the unresolved issues highlighted by Google are fixed.
A separate app named Homestyler was also found to collect private information from users’ Android devices.
According to Palo Alto researchers, the full list of data collected by the apps includes:
Phone model
Screen resolution
Phone MAC address
Carrier (Telecom Provider)
Network (Wi-Fi, 2G, 3G, 4G, 5G)
Android ID
IMSI number
International Mobile Equipment Identity (IMEI) number
Using a machine learning-based algorithm designed to detect anomalous spyware traffic, the origin of the data leak was traced to Baidu’s Push SDK as well as ShareSDK from the Chinese vendor MobTech, the latter of which supports 37,500 apps, including more than 40 social media platforms.
While Google has taken steps to secure the Play store and stop the malicious activity, bad actors are still finding ways to infiltrate the app marketplace and leverage the platform for their gain.
Indeed, an academic study published by researchers from NortonLifeLock earlier this month found the Play Store to be the primary source of malware installs (about 67.5%) on Android devices, based on an analysis of app installations on 12 million handsets over a four-month period between June and September 2019, fueled in part by the wide popularity of the platform.
However, its vector detection ratio — the ratio of unwanted apps installed through that vector to all apps installed through that vector — was found to be only 0.6%, compared with 3.2% for alternative third-party app stores.
“Thus, the Play market defenses against unwanted apps work, but still significant amounts of unwanted apps are able to bypass them, making it the main distribution vector for unwanted apps,” the researchers said.
If anything, the incident is yet another reminder that no app, even if developed by a legitimate third-party, can be taken for granted.
This also means the usual safeguards, such as scrutinizing app reviews, developer details, and the list of requested permissions, may not offer enough protection, making it difficult to ascertain whether a permission is being misused by cybercriminals to steal private data.
“In mobile devices, it is typical to ask a user to grant a list of permissions upon installation of an application or to prompt a user to allow or deny a permission while the application is running,” Palo Alto researchers concluded.
“Disallowing permissions can often result in a non-working application, which leads to a bad user experience and might tempt a user to click on ‘allow’ just to be able to use an application. Even if a certain permission is granted, it is often up to the app developers whether it is used in accordance with the official guidelines.”
cPanel, a provider of popular administrative tools to manage web hosting, has patched a security vulnerability that could have allowed remote attackers with access to valid credentials to bypass two-factor authentication (2FA) protection on an account.
The issue, tracked as “SEC-575” and discovered by researchers from Digital Defense, has been remedied by the company in versions 11.92.0.2, 11.90.0.17, and 11.86.0.32 of the software.
cPanel and WHM (Web Host Manager) offers a Linux-based control panel for users to handle website and server management, including tasks such as adding sub-domains and performing system and control panel maintenance. To date, over 70 million domains have been launched on servers using cPanel’s software suite.
The issue stemmed from a lack of rate limiting on 2FA code submissions during login, making it possible for a malicious party to repeatedly submit 2FA codes using a brute-force approach and circumvent the authentication check.
Digital Defense researchers said an attack of this kind could be accomplished in minutes.
“The two-factor authentication cPanel Security Policy did not prevent an attacker from repeatedly submitting two-factor authentication codes,” cPanel said in its advisory. “This allowed an attacker to bypass the two-factor authentication check using brute-force techniques.”
The company has now addressed the flaw by adding a rate limit check to its cPHulk brute-force protection service, causing a failed validation of the 2FA code to be treated as a failed login.
This is not the first time the absence of rate-limiting has posed a serious security concern.
Back in July, video conferencing app Zoom fixed a security loophole that could have allowed potential attackers to crack the numeric passcode used to secure private meetings on the platform and snoop on participants.
It’s recommended that cPanel customers apply the patches to mitigate the risk associated with the flaw.
Networking solutions provider Belden Inc. has been hacked and employee and company data stolen.
Described Tuesday by the company as a “data incident involving unauthorized access” and a “sophisticated attack by a party outside the company,” the data theft is said to have involved hackers gaining access to a limited number of company file servers.
According to a statement from the company reported today by Security Week, the stolen data may have contained names, birthdates, government-issued identification numbers, bank account information, home addresses, email addresses and other employment information. The limited company information stolen is said to involve details of business partners, including bank account numbers and taxpayer I.D. numbers.
Belden has gone through the typical checklist of standard responses: activating its cybersecurity response plan, deploying teams of internal information technology specialists, hiring third-party forensic cybersecurity experts, and informing regulatory officials and law enforcement.
“Safety is always paramount at Belden and we take threats to the privacy of personal and company information very seriously,” said Belden Chief Executive Roel Vestjens. “We regret any complications or inconvenience this incident may have caused and are offering assistance to those individuals who may have been impacted.” That assistance includes offering free credit monitoring services.
The company did not share exactly when the hack took place or what it involved.
“A consistent theme in recent security breaches is that cybercriminals only need to find and exploit the weakest links in order to cause significant damage,” Chris Clements, vice president of solutions architecture at cybersecurity company Cerberus Cyber Sentinel Corp. told SiliconANGLE. “Poor password hygiene, employees falling victim to phishing or VPN appliances that aren’t included in the regular organization patch cadence are all low-hanging fruit for cybercriminals to target for exploitation.”
Clements said attackers thrive on those things that are missed or orphaned. “The only strategy to ensure that an organization stays as protected as possible is to adopt a culture of security that is first in the minds of all employee personnel from executive leadership to line of business operations,” he said.
The mention of VPN appliances that aren’t updated being targeted comes as a hacker has published a list of credentials for nearly 50,000 Fortinet Inc. FortiGate virtual private networking systems connected to the internet. In that case, hackers exploited a known vulnerability that has had a patch available since May 2019, but some users have not applied it.
A hacker has published a list of credentials for nearly 50,000 Fortinet Inc. FortiGate virtual private networking systems connected to the internet that can be exploited using a known vulnerability.
The 6.7-gigabyte uncompressed database is being offered on popular hacking forums and is claimed to be “the most complete achieve containing all exploit links and sslvpn websession files with username and passwords.” The person offering the database, using the name arendee2018, also claims the database contains links and all web sessions files from the Fortinet devices.
The data traces back to information stolen on Nov. 19 by a hacker going by the name “pumpedkicks,” who published a list of one-line exploits for Fortinet FortiGate IPs containing a vulnerability classified as CVE-2018-13379, HackRead reported. The newly published database appears to have used those exploits to compile credentials and other related data.
The vulnerability was uncovered by researchers in Taiwan in August 2018 and is described as a “path traversal vulnerability in the FortiOS SSL VPN web portal [that] may allow an unauthenticated attacker to download FortiOS system files through specially crafted HTTP resource requests.” Fortinet issued a patch in May 2019 and warned customers again in August 2019 and in July of the need to apply it. Unfortunately, not all companies and users regularly apply security updates, leaving themselves vulnerable to hacking.
In July, Fortinet warned that advanced persistent threat groups, including APT29 (also known as Cozy Bear), were using the vulnerability to target COVID-19 vaccine development in Canada, the U.S. and the U.K. A similar warning was issued July 16 by the U.K. National Cyber Security Centre and Canada’s Communications Security Establishment, with support from the U.S. Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency.
All Fortinet customers are advised, if they haven’t done so already, to immediately upgrade all FortiGate systems to the latest firmware releases, to validate that all SSL-VPN local users are expected and have correct email addresses assigned, and to perform a password reset on all users.
“In this incident, the exploitation of the specific CVE allowed an unauthenticated attacker to download system files through uniquely crafted HTTP resource requests,” Vinay Sridhara, chief technology officer of security posture transformation firm Balbix Inc., told SiliconANGLE. “By using special elements such as ‘..’ and ‘/’ separators, attackers can get around the restricted location to access files or directories that are elsewhere on the system.”
Sridhara added that about 50,000 records belonging to banks, telecoms and government organizations were exposed by this data leak, including session-related information and plain-text usernames and passwords of Fortinet VPN users. “What’s most concerning is that even if the vulnerability is patched, the credentials are still at risk for credential stuffing attacks,” he said.
The malware takes advantage of how the Windows command line interpreter works to try and slip past anti-detection tools, Huntress Labs says.
Researchers at Huntress Labs have uncovered what they described as a really clever use of Windows batch scripting by the authors of Trickbot to try and sneak the latest version of their malware past automated detection tools.
The technique takes advantage of the way the Windows command line interpreter, cmd.exe, reads and interprets data that is typed on the command line.
What the authors of Trickbot have done is use a batch script to break up their payload into numerous small chunks and then use the command line interpreter to rebuild the original payload, says John Hammond, senior security researcher at Huntress.
“The gist of this technique is substituting each character in a payload with a new mapped value, so the payload can be slowly created with building blocks.”
This technique isn’t specific to Trickbot. In fact, any other code or malware sample can do this within Windows batch scripting, Hammond says. But this is the first time that Huntress has observed a threat actor using this exact obfuscation technique, he says. “It seems to be a very simple technique, and now that Trickbot has introduced it, it may become more popular.”
Though PowerShell and other command line tools are now available for Windows, cmd.exe remains the default command line interpreter for the operating system, as it has for decades. The technology makes a good target for attackers because it provides an interactive interface that they can use to execute commands, run malicious programs, delete files, and carry out a variety of other actions.
Typically, the command prompt can be hardened and locked down with application whitelisting or more secure configuration settings, Hammond notes. “But if a threat actor can access it, it is really a high-value target,” because of the extent of control that it allows an attacker to establish over the operating system.
According to Huntress, the batch script that the Trickbot authors used to obfuscate their payload looks like a whole lot of garbled code with random letters and weird percent signs scattered all over. But a closer examination of the code showed that it is designed to create small, one- or two-character variable values that hold chunks of the final payload. Though the code might look completely unintelligible, cmd.exe interprets it and executes it.
Troublingly, to an automated scanner, the multiple smaller chunks would not look like malicious strings, so the malware would be able to evade detection. “This obfuscated loader primarily evades signature-based detection — which hunts for known bad strings or characters that can indicate malicious activity,” Hammond says.
For organizations, the main takeaway is that automated tools don’t guarantee protection against all malware threats. In this case, the obfuscation that the authors of Trickbot employed would have been relatively easy to spot for human analysts. But a scanner might not see any evidence of malicious or bad commands and let the malware slip past, Hammond says.
“A real person might look at this batch script and say, ‘oh, if you just add this command right here, it spits out the payload and tells you what exactly it is trying to do,'” he says. “Automated solutions do not and cannot think to do that.”
The FBI is warning internet users to be on high alert for website and email domains masquerading as those of the crime-fighting agency.
The Bureau claimed in a Public Service Announcement that it has detected multiple threat actors registering fake domains mimicking legitimate FBI ones, which could be the precursor to a new campaign.
Cyber-criminals typically register domains that look identical to those of their victims, but which contain very small differences, such as an alternative TLD after the dot, or a slightly different spelling. Internationalized Domain Names (IDNs) also offer opportunities to use Cyrillic and other letters that look very similar to Roman alphabet characters.
Internet users could visit such sites of their own accord or be prompted to do so via phishing emails which also use spoofed domains to appear more trustworthy.
“Spoofed domains and email accounts are leveraged by foreign actors and cyber-criminals and can easily be mistaken for legitimate websites or emails,” the notice warned.
“Adversaries can use spoofed domains and email accounts to disseminate false information; gather valid usernames, passwords, and email addresses; collect personally identifiable information and spread malware, leading to further compromises and potential financial losses.”
The Feds urged members of the public to ensure web and email addresses are correctly spelled, and that operating systems, computer software and anti-malware tools are all up-to-date.
It recommended that users disable macros, never open unsolicited emails or attachments, and never provide personal information to the sender.
Tim Helming, security evangelist at DomainTools, argued that part of being security aware is becoming familiar with common abuse patterns.
“In this case, many of the illegitimate domains use various other words in conjunction with ‘fbi,’ which is a common practice by malicious actors. However, since legitimate organizations do own variations on their own domain names, internet users also need to consider the context of any link they are presented with,” he added.
“For example, if a link referring to the FBI (or other government agency) arrives as an unsolicited text message, there is a high likelihood of fraud. When in doubt, users should type the simplest version of the domain name (such as fbi.gov) into the browser, and navigate around the site to find the content they seek.”
A third of people surveyed said they have lost more than £1,000 to fraudsters.
More than half of Brits (56%) say they haven’t done anything to ward off potential scams, despite two-thirds (66%) saying they believe COVID-19 has led to more opportunities for fraudsters.
According to research conducted by mortgage loan company Ocean Finance, there were 1,467,962 instances of fraud reported between 2018 and 2020, with a 12% rise in the past year alone. These figures were taken from an FOI request submitted to the National Fraud Authority.
Ocean Finance also found, from a poll of 1,000 people, that a third of those who disclosed how much money they had lost to fraudsters said it topped £1,000 ($1,335).
As evidence mounts that many of us are being targeted by scammers, less than 10% of Brits (8%) think enough is being done to stop financial fraud.
Alongside this, 54% say they don’t even trust that a scammer would be caught if the incident were reported.
While some are able to quantify what they have lost through fraud, an eighth of those surveyed weren’t even sure if they had been a victim of a scam.
Here are some commonly used scams:
Impersonation Scams: One of the more common scams, this involves a fraudster getting in touch and pretending to be from a trusted source (e.g. the police or your bank) to persuade you to hand over details. UK Finance has reported that, following an 84% rise from the previous year, there were 15,000 cases reported between January and June 2020, with an estimated £58m lost.
Spear Phishing: A more targeted version of wider “impersonation scams,” this usually involves emails which contain a certain level of information you wouldn’t expect fraudsters to have, lowering your guard and making you more likely to share personal data.
Smishing: Like “phishing” but using SMS as opposed to emails (e.g. fraudulent messages from your bank asking for your logins to ‘unlock’ your account).
New Account Fraud (NAF): A scam where fraudsters use your details to open a new bank account, with the objective being to max out the credit limit as soon as possible before vanishing. This often leaves the victim caught up in legal and financial trouble whilst the case is resolved.
Boiler Room Scams: A classic “too good to be true” scam where victims are cold-called about a sure-fire way to make money quickly (e.g. stocks or investments) and are pressured into making an on the spot decision about transferring money. In 2018, an FCA investigation led to the arrest of 5 people who’d conned £2.8m from unsuspecting members of the public.
Friday Afternoon Fraud: A more specific yet widely reported form of fraud where scammers target the email addresses of solicitors (usually on a Friday due to higher rates of mortgage fee transactions on that day) to trick them into sending buyers’ money to alternative accounts. This can lead to individual losses in the tens of thousands and huge disruption in property chains.
The Apache module mod_rewrite allows you to rewrite URL requests that come into your server and is based on a regular-expression parser. The examples presented here show how to:
Direct requests for one subdirectory to a different subdirectory or the primary directory (document root) Example: http://example.com/folder1/ becomes http://example.com/folder2/ or just http://example.com/.
Direct requests to a subdirectory Example: http://example.com/file.html becomes http://example.com/folder1/file.html.
Add www to every request Example: http://example.com becomes http://www.example.com. Or, convert http:// to https://.
Convert URL to all lowercase using Rewrite Map Example: YourDomaIn.com/recIpeS.html becomes yourdomain.com/recipes. This will help prevent typos from producing HTTP errors.
mod_rewrite
When implemented correctly, mod_rewrite is very powerful. There are many other applications for mod_rewrite that you can learn about at apache.org. Please reference their website for other possible rewrite scenarios.
These examples are provided as a courtesy – (mt) Media Temple does not design custom rewrite rules for individual customer websites.
READ ME FIRST
Advanced Support can help!
If you’re having trouble with the steps in this article, additional assistance is available via Advanced Support, our dedicated team of specialists. For more information on what we can do for you, please click here.
Requirements
Before you start, please have handy:
FTP user credentials
INSTRUCTIONS
Create a plain text .htaccess file (click the link for details on this type of file), or add the lines from the example to the top of your existing .htaccess file.
Add the lines from the appropriate example to your file. Note that you should replace example text with your own information. Replace example.com with your own domain, folder1 with your own folder name, file.html with your own file name, etc. Save your changes.
Use FTP to upload the file to the document root of the appropriate domain. If your domain is example.com, you should upload the file to:
/var/www/vhosts/example.com/httpdocs/
That’s it! Once you’ve uploaded the file, the rewrite rule should take effect immediately.
Some Content Management Systems (CMSs), like WordPress for example, overwrite .htaccess files with their own settings. In that case, you may need to figure out a way to do your rewrite from within the CMS.
Direct requests for one subdirectory to a different subdirectory or the document root
http://example.com/folder1/ becomes http://example.com/folder2/ or just http://example.com/.
Note: domains/example.com/html/folder2/ must exist and have content in it for this to work.
.HTACCESS
This .htaccess file will redirect http://example.com/folder1/ to http://example.com/folder2/. Choose this version if you don’t have the same file structure in both directories:
Filename: .htaccess
Options +FollowSymLinks
RewriteEngine On
RewriteRule ^folder1.*$ http://example.com/folder2/ [R=301,L]
This .htaccess file will redirect http://example.com/folder1/ to plain http://example.com/. Choose this version if you want people redirected to your home page, not whatever individual page in the old folder they originally requested:
Filename: .htaccess
Options +FollowSymLinks
RewriteEngine On
RewriteRule ^folder1.*$ http://example.com/ [R=301,L]
This .htaccess file will redirect http://example.com/folder1/file.html to http://example.com/folder2/file.html. Choose this version if your content is duplicated in both directories:
Filename: .htaccess
Options +FollowSymLinks
RewriteEngine On
RewriteRule ^folder1/(.*)$ http://example.com/folder2/$1 [R=301,L]
Test
Upload this file to folder2 (if you followed the first or third example) or your html folder (if you followed the second example) with FTP:
Filename: index.html
<html>
<body>
Mod_rewrite is working!
</body>
</html>
Then, if you followed the first or second example, visit http://example.com/folder1/ in your browser. You should see the URL change to http://example.com/folder2/ or http://example.com/ and the test page content.
If you followed the third example, visit http://example.com/folder1/index.html. You should be redirected to http://example.com/folder2/index.html and see the test page content.
Code explanation
Options +FollowSymLinks is an Apache directive, prerequisite for mod_rewrite.
RewriteEngine On enables mod_rewrite.
RewriteRule defines a particular rule.
The first string of characters after RewriteRule defines what the original URL looks like. There’s a more detailed explanation of the special characters at the end of this article.
The second string after RewriteRule defines the new URL. This is in relation to the document root (html) directory. / means the html directory itself, and subfolders can also be specified.
$1 at the end matches the part in parentheses () from the first string. Basically, this makes sure that sub-pages get redirected to the same sub-page and not the main page. Leave it out to redirect to the main page. (It is left out in the first two examples for this reason. If you don’t have the same content in the new directory that you had in the old directory, leave this out.)
[R=301,L] – this performs a 301 redirect and also stops any later rewrite rules from affecting this URL (a good idea to add after the last rule). It’s on the same line as RewriteRule, at the end.
Note: The directory folder1 must be unique in the URL. It won’t work for http://example.com/folder1/folder1.html. The directory folder1 must exist and have content in it.
.HTACCESS
This .htaccess file will redirect http://example.com/file.html to http://example.com/folder1/file.html:
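The rewrite rule itself is not shown here. Based on the code explanation that follows (a host condition plus a negated condition so that URLs already containing folder1 are not rewritten again), a reconstruction might look like the sketch below; treat the exact conditions as assumptions rather than the original rule:

```apache
# Filename: .htaccess
Options +FollowSymLinks
RewriteEngine On
# Only rewrite requests for this host
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
# Don't rewrite URLs that already include folder1 (prevents an infinite loop)
RewriteCond %{REQUEST_URI} !^/folder1/
RewriteRule ^(.*)$ http://example.com/folder1/$1 [R=301,L]
```

As in the earlier examples, the test page below can be uploaded to folder1 to verify the redirect.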
<html>
<body>
Mod_rewrite is working!
</body>
</html>
Then, visit http://example.com/ in your browser. You should see the URL change to http://example.com/folder1/ and the test page content.
Code explanation
Options +FollowSymLinks is an Apache directive, prerequisite for mod_rewrite.
RewriteEngine On enables mod_rewrite.
RewriteCond %{HTTP_HOST} shows which URLs we do and don’t want to run through the rewrite.
In this case, we want to match example.com.
! means “not.” We don’t want to rewrite a URL that already includes folder1, because then it would keep getting folder1 added, and it would become an infinitely long URL.
[NC] matches both upper- and lower-case versions of the URL.
RewriteRule defines a particular rule.
The first string of characters after RewriteRule defines what the original URL looks like. There’s a more detailed explanation of the special characters at the end of this article.
The second string after RewriteRule defines the new URL. This is in relation to the document root (html) directory. / means the html directory itself, and subfolders can also be specified.
$1 at the end matches the part in parentheses () from the first string. Basically, this makes sure that sub-pages get redirected to the same sub-page and not the main page. Leave it out to redirect to the main page of the subdirectory.
[R=301,L] – this performs a 301 redirect and also stops any later rewrite rules from affecting this URL (a good idea to add after the last rule). It’s on the same line as RewriteRule, at the end.
This .htaccess file will redirect http://example.com/ to http://www.example.com/. It will also work if an individual file is requested, such as http://example.com/file.html:
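The .htaccess for the www example is not shown here; a minimal sketch, modeled on the other examples in this article (replace example.com with your own domain):

```apache
# Filename: .htaccess
Options +FollowSymLinks
RewriteEngine On
# Match requests whose host is the bare domain, without www
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```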
This .htaccess file will redirect http://example.com/ to https://example.com/. It will also work if an individual file is requested, such as http://example.com/file.html:
Filename: .htaccess
RewriteEngine On
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
Test
Visit http://example.com in your browser. You should see that the same page is displayed, but the URL has changed to http://www.example.com (first example) or https://example.com (second example).
Also, http://example.com/file.html will become http://www.example.com/file.html or https://example.com/file.html.
Code explanation
Options +FollowSymLinks is an Apache directive, prerequisite for mod_rewrite.
RewriteEngine On enables mod_rewrite.
RewriteCond %{HTTP_HOST} shows which URLs we do and don’t want to run through the rewrite.
In this case, we want to match anything that starts with example.com.
[NC] matches both upper- and lower-case versions of the URL.
RewriteRule defines a particular rule.
The first string of characters after RewriteRule defines what the original URL looks like. There’s a more detailed explanation of the special characters at the end of this article.
The second string after RewriteRule defines the new URL. This is in relation to the document root (html) directory. / means the html directory itself, and subfolders can also be specified.
$1 at the end matches the part in parentheses () from the first string. Basically, this makes sure that sub-pages get redirected to the same sub-page and not the main page.
[R=301,L] – this performs a 301 redirect and also stops any later rewrite rules from affecting this URL (a good idea to add after the last rule). It’s on the same line as RewriteRule, at the end.
CONVERT URL TO ALL LOWERCASE USING REWRITE MAP
This .htaccess rule will make sure that all characters entered into a URL are converted to lowercase. This helps prevent errors caused by typos.
Note: Because this rule requires an edit to a server level configuration file, Grid and Managed WordPress users will not be able to implement this rule.
In order for this to work properly, you must also add a directive to your vhost file (httpd.conf):
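The directive itself does not appear here; for Apache’s RewriteMap with the built-in tolower internal function, it would typically be a single line (the map name lc matches the ${lc:...} reference used in the .htaccess lines below):

```apache
# In the vhost configuration (httpd.conf), define an internal lowercase map
RewriteMap lc int:tolower
```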
For Plesk: Navigate to Domains > example.com > Web Hosting Settings > Additional Apache Directives, and place the above code.
Next, open your .htaccess and add the following lines:
RewriteEngine On
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule . ${lc:%{REQUEST_URI}} [R=301,L]
Note: Instead of using RewriteMap to convert URLs to lowercase, Apache recommends using mod_speling to ignore case sensitivity.
Test
Navigate to your domain using a combination of uppercase and lowercase letters.
Code Explanation
RewriteEngine On enables mod_rewrite.
RewriteCond %{REQUEST_URI} [A-Z] – Matches any request whose URI contains an uppercase letter.
RewriteRule . ${lc:%{REQUEST_URI}} – Uses the ‘lc’ map that was added to the vhost file to convert all characters to lowercase.
[R=301,L] – Performs a 301 redirect and also stops any later rewrite rules from affecting this URL (a good idea to add after the last rule). It’s on the same line as RewriteRule, at the end.
Regular expressions
Rewrite rules often contain symbols that make a regular expression (regex). This is how the server knows exactly how you want your URL changed. However, regular expressions can be tricky to decipher at first glance. Here’s some common elements you will see in your rewrite rules, along with some specific examples.
^ begins the line to match.
$ ends the line to match.
So, ^folder1$ matches folder1 exactly.
. stands for “any single character” (example: a, B, 3).
* means that the previous character can be matched zero or more times.
So, ^uploads.*$ matches uploads2009, uploads2010, etc.
^.*$ means “match anything and everything.” This is useful if you don’t know what your users might type for the URL.
() designates which portion to preserve for use again in the $1 variable in the second string. This is useful for handling requests for particular files that should be the same in the old and new versions of the URL.
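As a quick illustration of ^, $, (), and $1 working together (using the hypothetical folder names from the earlier examples):

```apache
# /folder1/recipes.html is redirected to /folder2/recipes.html,
# because (.*) captures "recipes.html" and $1 re-inserts it
RewriteRule ^folder1/(.*)$ /folder2/$1 [R=301,L]
```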
Examine the new URL in your browser closely. Does it match a file that exists on the server in the new location specified by the rewrite rule? You may have to make your rewrite rule more broad (you may be able to remove the $1 from the second string). This will direct rewrites to the main index page given in the second string. Or, you may need to copy files from your old location to the new location.
If the URL is just plain wrong (like http://example.com/folder1//file.html – note the two /s) you will need to re-examine your syntax. (mt) Media Temple does not support syntax troubleshooting.
INFINITE URL, TIMEOUT, REDIRECT LOOP
If you notice that your URL is ridiculously long, that your page never loads, or that your browser gives you an error message about redirecting, you likely have conflicting redirects in place.
You should check your entire .htaccess file for rewrite rules that might match other rewrite rules. You may also need to check .htaccess files in subdirectories. Note that FTP will not show .htaccess files unless you have enabled the option to view hidden files and folders. See our .htaccess article for details.
Also, it’s possible to include redirects inside HTML and PHP pages. Check the page you were testing for its own redirects.
Adding [L] after a rewrite rule can help in some cases, because that tells the server to stop trying to rewrite a URL after it has applied that rule.
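A short sketch shows the difference (the directory names are hypothetical). With internal rewrites, a URL matched by the first rule would otherwise continue through the rule set and could be rewritten again in the same pass:

```apache
RewriteEngine On
# With [L], a request for /blog/post.html stops rewriting here
# and is served from /legacy/post.html
RewriteRule ^blog/(.*)$ /legacy/$1 [L]
# Without the [L] above, this rule would also run on the
# already-rewritten URL in the same pass
RewriteRule ^legacy/(.*)$ /archive/$1 [L]
```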
Among the many various tools for customizing your web server, the .htaccess config file is a tremendous asset. You can quickly reset document types, parsing engines, URL redirects, and many other crucial features. Webmasters who are not very technical may not get into the specifics of managing their own .htaccess file. But the topic itself is fascinating and worth some investigation.
For this article I want to present some of the more purposeful concepts for webmasters and web developers. Anybody who is launching their own website on an Apache server will definitely want to understand how to manage their .htaccess file. It provides so much customizability, and it works alongside any web language from PHP to Ruby.
At the bottom of this post I have added some external webapps to help newcomers generate their .htaccess files dynamically.
Why use an .htaccess File?
This is a great question and perhaps we should start by answering “what is an .htaccess file”? It is a very special configuration file used by the Apache web server. An .htaccess file can tell the web server how to present various forms of information and how to handle various HTTP request headers.
Really it is a decentralized means of organizing web server settings. One physical server may hold 50 different websites, each with its own .htaccess file. It grants webmasters a lot of control that would otherwise be impossible. But why should you use one?
The biggest reason is security. You can lock out certain directories or make them password protected. This is great for private projects or new Content Management Systems where you want a little extra security. But there are also common tasks like redirecting 404 error messages to a certain webpage. This only takes a single line of code and it can dramatically impact how visitors react to missing pages.
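The 404 redirect mentioned above really is a one-liner (the page path is a placeholder you would create yourself):

```apache
# Serve a custom page whenever a request returns a 404
ErrorDocument 404 /404.html
```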
Truthfully there is not much I can say to convince others that an .htaccess file is worth understanding. Once you see it in action then you can recognize all of the value which comes from this tiny config file. Also I hope the rest of this article may present some insightful topics to bring webmasters into the light of managing an .htaccess configuration.
Allow/Deny Access
It is possible to recognize potential spam visitors and deny them access to your website. This can be a little extreme; however, if you know that a person or group of people has been targeting your website, there are some options to choose from. You could pick a domain referral to deny, or ban visitors by IP address.
order allow,deny
deny from 255.0.0.0
deny from 123.45.6.
allow from all
These sample codes were copied from Htaccess Guide as they are the perfect template for getting started. Notice the 2nd IP address is missing the 4th integer. This code block will target the first IP (255.0.0.0) and every IP within the range 123.45.6.0 to 123.45.6.255, then allow all other traffic. Webmasters may not use this as frequently as other techniques, but it is helpful to understand.
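As a side note, Apache 2.4 replaced the Order/Allow/Deny directives with mod_authz_core's Require syntax. A sketch of the same policy on a 2.4 server might look like:

```apache
# Apache 2.4+ equivalent of the Order allow,deny block above
<RequireAll>
    Require all granted
    Require not ip 255.0.0.0
    Require not ip 123.45.6
</RequireAll>
```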
Prevent Directory Listing
There will be times when you have an open directory which is set up to allow browsing by default. This means users can view all the files listed inside an internal directory structure, like your images folder. Some webmasters do not want to allow directory listing and thankfully the code snippet is pretty easy to remember.
Options -Indexes
I have seen this answer presented countless times throughout Stack Overflow and it may be one of the easiest .htaccess rules to remember.
It is possible to actually create multiple .htaccess files inside each of these directories so maybe one of them is password protected but the others are not. And you can still keep the Options -Indexes so that visitors cannot browse through your website /images/ folder.
Password Protection
Password-protecting your directories is a very common procedure for securing administration areas and other folders crucial to your website. Sometimes you will only want to offer access to a small group of people. Other times passwords are there to prevent hackers from gaining access to your website administration panel. Either way, it is a very powerful solution to a whole host of problems.
There is a handy guide on password protection which outlines the important code snippets. You will need to generate a password file which stores the username/password credentials. This is how Apache can check what the user inputs to see if they should be granted access. Notice that you will need to generate an entry for each username and password.
I would recommend using this htpassword generator so you can save a bit of time. The syntax will always come out perfect and you do not need to encrypt the password yourself. And the other great option is to password protect an entire directory listing. We can see this example in CSS-Tricks code snippets gallery.
AuthType Basic
AuthName "This Area is Password Protected"
AuthUserFile /full/path/to/.htpasswd
Require valid-user
Security for WordPress
To put this password-protection idea to good use, let’s display a real-world example. This more complicated code snippet will force user authentication for anybody accessing the WordPress wp-login.php file. You will find the original source on Ask Apache which has numerous other WordPress protection snippets.
<Files wp-login.php>
Order Deny,Allow
Deny from All
Satisfy Any
AuthName "Protected By AskApache"
AuthUserFile /web/askapache.com/.htpasswda1
AuthType Basic
Require valid-user
</Files>
And if you are going to follow these .htaccess rules it might also help to password protect the admin area. Typically the wp-login.php file is going to get the most hits from people attempting to brute force their way into your system. So even just the sample codes above would be more than enough added security for your WordPress website.
HTTP URL Rewrite Rules
Rewriting URLs is probably one of the most common uses for .htaccess files. WordPress default installations can actually generate an .htaccess file right from the administration panel. This allows you to create pretty URLs which do not have the .php?p=1 structure.
I want to look at this rewrite example on how to update underscores to dashes since it contains a lot of the most important elements.
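For reference, a simplified sketch of that underscore-to-dash approach (the d.com domain comes from the linked example) might look like:

```apache
RewriteEngine On
RewriteBase /
# Replace one underscore per pass; [N] restarts the rule set
# until none remain, and E= sets a marker variable
RewriteRule ^([^_]*)_(.*)$ $1-$2 [N,E=uscor:Yes]
# Once every underscore is gone, issue a single 301 redirect
RewriteCond %{ENV:uscor} ^Yes$
RewriteRule ^(.*)$ http://d.com/$1 [R=301,L]
```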
RewriteEngine and RewriteBase can almost always be set to these exact values. But you need the RewriteEngine turned on for anything else to work. There are plenty of guides online explaining how to enable mod_rewrite, and your hosting provider can also help.
Notice the syntax follows a pattern of RewriteCond lines at the top. These conditions are matched against the incoming HTTP request. They are answered by a RewriteRule, which in this case redirects everything to the domain d.com. The ending brackets like [R=301,L] are called rewrite flags; they are important, but more of an advanced topic.
If you want to delve a bit deeper you can find a long list of flags on this cheatsheet webpage.
The mod_rewrite syntax is definitely a little confusing but don’t be intimidated! The snippets can look a lot easier in other examples.
When just getting started, I have to recommend this mod_rewrite webapp that helps you generate code samples using real URLs. This is a brilliant tool because you can look up various items in the syntax to see what they actually do in the Rewrite rules. Here is another great tutorial with a simpler example to study:
RewriteRule ^dir/([0-9]+)/?$ /index.php?id=$1 [L]
Don’t try to overload yourself on these all at once. It took me well over 3-4 months to really start understanding how to rewrite URLs with [0-9a-zA-Z]+ and similar patterns. Keep on practicing and in time I promise you will get this stuff like it’s common-sense knowledge.
Code Snippets for Webmasters
I love easy-to-use snippets and I want to put together this small collection of pertinent .htaccess codes for webmasters. Each of these ideas can fit nicely into your own .htaccess file along with other code blocks. Most of these snippets are great for solving quick problems or fixes in your web server environment. Imagine the perfect Apache setup for brand new webmasters just getting started online.
Setting DirectoryIndex
The DirectoryIndex directive is commonly used on a single line. You can tell Apache which documents should be initially treated as the “main” document. By default this will target items such as index.html, index.php, index.asp, and other index files. But using this code snippet which I’ve copied below, you have the ability to make this root document anything you like.
DirectoryIndex index.html index.cgi index.php
The order of documents should start with the most important and move through the ranks to the least important. So if we do not have an HTML or CGI file then the fallback will go to index.php. And you could even name these files home.php or someotherfile.php and it is all valid syntax.
Force WWW or Non-WWW Subdomain
Google can work with both versions of your website domain if you do not specify www.domain.com or just domain.com. In my experience it is best practice to choose one of these and set it as the only choice via .htaccess. Then Google will not index various URLs with some pointing to the WWW subdomain while others do not.
# Force WWW Subdomain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [L,R=301]
# No Subdomain
RewriteEngine On
RewriteCond %{HTTP_HOST} !^domain\.com$ [NC]
RewriteRule ^(.*)$ http://domain.com/$1 [L,R=301]
This code snippet comes from a CSS-Tricks archive and provides a very handy solution. You should update the domain to be whatever you need for your own website. Otherwise there will be problems and you’ll notice right away! But I do highly support forcing one of these two options and it is at the top of my tasks list after launching a new website.
Force Media File Downloads
Another fairly important snippet allows forcing certain media types to download instead of being displayed in the browser. Immediately I can think of PDF documents and MP3 audio files which may be presented in a downloadable format, but how do you make sure they are downloadable? I found a similar article published on Htaccess Guide which outlines this code snippet.
AddType application/octet-stream .zip .mp3 .mp4
Feel free to include even more filetypes at the end of this line. All of the media formats using the octet-stream MIME type will be downloadable. Forcing this through .htaccess is a very direct route to ensure people are not able to view these files in the browser.
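If you also want browsers to download file types that should keep their normal MIME type (PDF documents were mentioned above), one alternative sketch, assuming mod_headers is enabled, is to set a Content-Disposition header instead:

```apache
# Force a download prompt for PDF and MP3 files
<FilesMatch "\.(pdf|mp3)$">
    Header set Content-Disposition attachment
</FilesMatch>
```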
Custom Error Documents
One final piece I want to add is a full template of custom error documents. Usually these number codes are only seen on the server end. But there are plenty of these status codes which you should be familiar with. A few examples might be the 403/404 errors and the 301 redirect.
This error code template starts at 100 and moves upwards into 500 errors. Please note that you obviously do not need all of these. Only the most common errors would be necessary, and possibly a few obscure snippets if you feel the need.
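A trimmed sketch covering only the most common codes (the /errors/ paths are placeholders you would create yourself) might be:

```apache
# Map common HTTP error codes to custom pages
ErrorDocument 400 /errors/400.html
ErrorDocument 401 /errors/401.html
ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html
ErrorDocument 500 /errors/500.html
```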
There are so many countless resources online discussing .htaccess files. My linked articles and webapps are a great place to get started. But keep practicing new ideas and don’t be afraid of testing out code snippets. As long as you have a backup file then you can test out anything you like and it is a fun learning experience.