Showing posts with label Technology.

Saturday, July 30, 2016

Best Animation Software


Realism is the key driver of the rising demand for 3D animation. This challenging technology pushes prospective artists and professional animators alike to develop new editions and versions of existing 3D animation programs. The frames within the models and shots are sometimes simple, but the action and movement of the characters is where the creativity and skill of the animator come in.

MS SQL Server Hosting Benefits


MS SQL Server is a computer application that offers advanced relational database management services to enterprises. A few of the essential points that make it an advantageous part of an Enterprise Resource Planning (ERP) proposition are: 

PSD to HTML conversion is so important


Today, PSD to HTML and PSD to XHTML conversions play a significant and critical role in web design and web development. The success or failure of any website or web application can sometimes depend on these basic PSD to XHTML/CSS conversions.

QuickBooks Hosting - A Superb Financial Solution


QuickBooks is the most commonly used financial software in small to medium-sized business organizations. It is a powerful accounting solution that can work in a variety of industries and business situations.

Make Your Website More Panda Friendly


The recent Google Panda update has raised the bar not only for SEO experts but also for web designers. It’s getting tougher and tougher to rank well, which means you really have to pay attention to every design and Google ranking factor.

Putting KVM Switches In Your Data Center


As your company grows, you realize that you need to process data faster in order to give your clients the information they need in a timely fashion. This need often compels a company to acquire more computers and servers, which take up more of the company's available space and take a toll on its existing staff.

Cheap printer ink cartridges help with cost cutting


With the increase in the use of printers in businesses and offices, demand for printer ink cartridges has gone up as well. At the same time, the quality of the pages and of the ink cartridges has become an essential factor in printing. Many times, people look for original equipment manufacturer (OEM) ink cartridges when their printers run short of ink.

Business Catalyst CMS Templates

Business Catalyst CMS templates are the perfect way to customise your Adobe Business Catalyst based website. Adobe Business Catalyst is a powerful CMS that offers much more than the typical fare. As well as the ability to add and manage your own online content, it also enables you to view data on your visitors, manage customer relations, and conduct extensive email marketing campaigns all from a single, convenient online login.

Where To Buy Cheap WoW Gold


World of Warcraft gold sellers have grown in number as the third game expansion approaches. More and more players are leveling alt characters and planning to gear them up for raiding more than twice a week with different characters. Based on a Google search for where to buy cheap WoW gold, below is an overview of the top five sites offering WoW gold services:

Is it time to step up to a full service VPS hosting option?


If you are fed up with the inadequate performance of your shared web server but not yet in a position to step up to a costly dedicated server, then the most suitable option for you is a Virtual Private Server (VPS). 

The U.S. Government: Paying to Undermine Internet Security, Not to Fix It




The Heartbleed computer security bug is many things: a catastrophic
tech failure, an open invitation to criminal hackers and yet another reason to upgrade our passwords on dozens of websites. But more than anything else, Heartbleed reveals our neglect of Internet security.

The United States spends more than $50 billion a year on spying and intelligence, while the folks who build important defense software — in this case a program called OpenSSL that ensures that your connection to a website is encrypted — are four core programmers, only one of whom calls it a full-time job.
In a typical year, the foundation that supports OpenSSL receives just $2,000 in donations. The programmers have to rely on consulting gigs to pay for their work. "There should be at least a half dozen full time OpenSSL team members, not just one, able to concentrate on the care and feeding of OpenSSL without having to hustle commercial work," says Steve Marquess, who raises money for the project.
Is it any wonder that this Heartbleed bug slipped through the cracks?
Dan Kaminsky, a security researcher who saved the Internet from a similarly fundamental flaw back in 2008, says that Heartbleed shows that it's time to get "serious about figuring out what software has become Critical Infrastructure to the global economy, and dedicating genuine resources to supporting that code."
The Obama Administration has said it is doing just that with its national cybersecurity initiative, which establishes guidelines for strengthening the defense of our technological infrastructure — but it does not provide funding for the implementation of those guidelines.
Instead, the National Security Agency, which has responsibility to protect U.S. infrastructure, has worked to weaken encryption standards. And so private websites — such as Facebook and Google, which were affected by Heartbleed — often use open-source tools such as OpenSSL, where the code is publicly available and can be verified to be free of NSA backdoors.
The federal government spent at least $65 billion between 2006 and 2012 to secure its own networks, according to a February report from the Senate Homeland Security and Government Affairs Committee. And many critical parts of the private sector — such as nuclear reactors and banking — follow sector-specific cybersecurity regulations.
But private industry has also failed to fund its critical tools. As cryptographer Matthew Green says, "Maybe in the midst of patching their servers, some of the big companies that use OpenSSL will think of tossing them some real no-strings-attached funding so they can keep doing their job."
In the meantime, the rest of us are left with the unfortunate job of changing all our passwords, which may have been stolen from websites that were using the broken encryption standard. It's unclear whether the bug was exploited by criminals or intelligence agencies. (The NSA says it didn't know about it.)
It's worth noting, however, that the risk of your passwords being stolen is still lower than the risk of your passwords being hacked from a website that failed to protect them properly. Criminals have so many ways to obtain your information these days — by sending you a fake email from your bank or hacking into a retailer's unguarded database — that it's unclear how many would have gone through the trouble of exploiting this encryption flaw.
The problem is that if your passwords were hacked by the Heartbleed bug, the hack would leave no trace. And so, unfortunately, it's still a good idea to assume that your passwords might have been stolen.
So, you need to change them. If you're like me, you have way too many passwords. So I suggest starting with the most important ones — your email passwords. Anyone who gains control of your email can click "forgot password" on your other accounts and get a new password emailed to them. As a result, email passwords are the key to the rest of your accounts. After email, I'd suggest changing banking and social media account passwords.
But before you change your passwords, you need to check if the website has patched their site. You can test whether a site has been patched by typing the URL here. (Look for the green highlighted "Now Safe" result.)
If the site has been patched, then change your password. If the site has not been patched, wait until it has been patched before you change your password.
A reminder about how to make passwords: Forget all the password advice you've been given about using symbols and not writing down your passwords. There are only two things that matter: Don't reuse passwords across websites and the longer the password, the better.
I suggest using password management software, such as 1Password or LastPass, to generate the vast majority of your passwords. And for email, banking and the password to your password manager, I suggest a method of picking random words from the dictionary called Diceware. If that seems too hard, just make your password super long — at least 30 or 40 characters, if possible.
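As a rough illustration of the Diceware idea, here is a minimal Python sketch. The word list below is a tiny placeholder, not the official 7,776-word Diceware list; a real passphrase should be drawn from the full list.

```python
import secrets

# A tiny illustrative word list; the real Diceware list has 7,776 words.
WORDS = ["correct", "horse", "battery", "staple", "orbit", "velvet",
         "canyon", "mural", "pigeon", "lantern", "quartz", "tundra"]

def diceware_passphrase(num_words=6, wordlist=WORDS):
    """Pick words uniformly at random using a cryptographically secure RNG."""
    return " ".join(secrets.choice(wordlist) for _ in range(num_words))

passphrase = diceware_passphrase()
print(passphrase)
```

With the full Diceware list, six random words give roughly 77 bits of entropy, which is why a handful of common words beats a short string of symbols.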
Source: Original Article from ProPublica.org

Cheat Sheet: Behind The U.S. Cyberattacks on Iran


This morning, The New York Times published a report detailing how the Bush and Obama administrations created the cyberweapon known as Stuxnet and used it to disrupt Iran’s uranium enrichment program.
Much has been written about Stuxnet, which, as ProPublica recently reported, remains a threat beyond Iran. But the Times account, based on interviews with unnamed U.S. and Israeli officials, is the most extensive account to date of U.S. cyberwarfare capabilities. Here’s our cheat sheet on what’s new and the fallout:
  • Because of Stuxnet’s complexity, cybersecurity analysts have long suspected it was a U.S.-Israeli effort. The Times story confirms this for the first time, disclosing that the project was code-named “Olympic Games.”
  • Olympic Games began under the Bush administration, and during development, it was known as “the bug.”
  • President Obama has repeatedly expressed concern that if the U.S. acknowledges it is behind Stuxnet, it would give terrorists and enemy states a justification for their own attacks.
  • Stuxnet was introduced into Iran's enrichment facility at Natanz by an unwitting Iranian. “It turns out there is always an idiot around who doesn’t think much about the thumb drive in their hand," a source told the Times.
  • To test the bug in secret Department of Energy labs, the U.S. used aging centrifuges handed over in 2003 by Libyan dictator Col. Muammar el-Qaddafi, making them into replicas of the nuclear enrichment facilities Iran used.
  • The attack on Iran became the first known instance of the U.S. using computer code to physically damage another country’s infrastructure. Obama, the Times writes, “was acutely aware that with every attack he was pushing the United States into new territory, much as his predecessors had with the first use of atomic weapons in the 1940s, of intercontinental missiles in the 1950s and of drones in the past decade.”
  • The Israeli role in the attack came from a military unit called Unit 8200 that had “technical expertise that rivaled” the U.S. National Security Agency’s as well as significant intelligence about Iran’s nuclear facilities.
  • When a programming error made Stuxnet’s code public in 2010, Obama considered halting Olympic Games altogether. But in the end, the administration decided to accelerate the attacks.
  • It’s unclear who was responsible for the programming error, but some in the Obama administration blamed the Israelis. The Times names Vice President Joe Biden:  “Mr. Biden fumed. ‘It’s got to be the Israelis,’ he said. ‘They went too far.’ ”
  • American officials claim that Flame, an even more complex piece of computer malware that has also attacked Iranian infrastructure, is not part of Olympic Games — but they didn’t explicitly deny it was an American project.
  • Opinion is divided as to whether Olympic Games was successful in slowing uranium enrichment in Iran. Administration officials said they had set the Iranians back 18 months to two years, but other experts say enrichment levels quickly recovered and that Iran today has enough fuel for five or more weapons with additional enrichment.
The Obama administration has long emphasized the importance of domestic cybersecurity, but recent statements show an increasing openness about offensive capabilities. Secretary of State Hillary Clinton acknowledged last month that government hackers had attacked Al Qaeda propaganda sites in Yemen, changing information in ads that talked about killing Americans to show how many Yemenis had died in Al Qaeda attacks. 
For years, the Iranians had no idea they were being attacked, blaming their own workers or faults in their facilities, The Times said. But because Stuxnet was inadvertently released, any government — not to mention any hacker with spare time and a malicious streak — can create their own mutation of the weapon.
As the Times points out, “No country’s infrastructure is more dependent on computer systems, and thus more vulnerable to attack, than that of the United States.” Siemens makes specialized industrial controllers that were targeted by the Olympic Games attacks. As Siemens confirmed to ProPublica, the same hardware and software holes Stuxnet took advantage of in Iran exist in thousands of locations in the U.S. and worldwide. The vulnerable equipment controls everything from natural gas pipelines to refineries and power transmission lines.  
American cybersecurity experts have long warned that it’s only a matter of time before someone turns an equally destructive cyberweapon on our own systems. Now that Stuxnet’s origins are clear, the odds of that happening might be even higher.
Contributing: Peter Maass of ProPublica
Source: Original Article from ProPublica.org

Too Human (Not) to Fail

This story was co-published with Source.
A coffee grinder that only works when the lid is on. An electrical plug that only fits into an outlet one way. Fire doors that stay unlocked in an emergency.

Lots of everyday objects are designed to prevent errors — saving clumsy and forgetful humans from our own mistakes or protecting us from worst-case scenarios. Sometimes designers make it impossible for us to mess up, other times they build in a backup plan for when we inevitably do. But regardless, the solution is baked right into the design.
This concept has a lot of names: defensive design, foolproof, mistake proof, fail-safe. None is as delightful as the Japanese poka-yoke.
The idea of the poka-yoke (which means literally, “avoiding mistakes”) is to design something in such a way that you couldn’t mess it up even if you tried. For example, most standard USB cables can only be plugged into a computer the correct way. Not to say you would never attempt to plug it in upside down, but if you do, it simply won’t fit. On the other hand, it’s easy to reverse the + and - ends of a battery when you replace them in your TV remote. The remote’s design provides other clues about the correct way to insert the batteries (like icons), but it’s still physically possible to mess it up. Not so with the USB cable. It only fits one way, by design.
Many consumer coffee grinders are another example of a design that physically prevents you from messing up. Even if you wanted to, you could NOT chop your fingers on the blade, because the “on” switch for the grinder is triggered by closing the lid (as opposed to a blender, which leaves its blades easily accessible to stray fingers).
The humble coffee grinder that only works when it’s closed. Source: arvind grover via Flickr
Foolproof design can also save your life. The mechanical diver’s watch is designed with a bezel that spins in only one direction. It functions as a simple timer that a diver can use to know how much oxygen is left in the tank.
In a blog post about resilient design, designer Steven Hoober describes how this smart design can prevent disaster:
If the ring were to get bumped, changing its setting, having it show less time might be inconvenient, but its going the other way and showing that you have more time than you do might kill you. You don’t even need to know how it works. It just works.
The diver watch will never show you more time than you actually have left underwater. Source: Naka7a via Flickr
Foolproof measures can be found throughout web design (although perhaps without the life-saving part). Ever fill out an online form incorrectly and only found out because you could not progress to the next step? That’s a conscious decision by a designer to prevent an error. In this case, from Yahoo, it’s even a chance to insert a little humor:
Yahoo’s humorous design prevents you from being born in the future. Source: UXmas
Sometimes, design cannot prevent you from messing up (we humans somehow always figure out a way to do things wrong). But it can still make it harder for you to do the wrong thing. This type of design is not exactly foolproof — more like fool-resistant.
Child-resistant safety caps on medicine bottles, for example, keep kids from accidentally overdosing. A water dispenser that makes you push an extra button or pull up a lever to dispense hot water makes it harder for you to accidentally scald yourself. Neither of these designs are as foolproof as the coffee grinder. But they do put an additional step between you (or your child) and disaster.
We see these features quite often on our computers. Most of us are familiar with the “Are you sure?” messages before you empty the Trash or the “Do you want to…” before you replace a file with another one by the same name. These alerts certainly don’t prevent us from making a mistake (in fact, we probably ignore them most of the time), but their purpose is to slow us down.
 
These pop-up messages put a small step between you and the loss of precious files.
Designers have also come up with more elaborate confirmation steps. For instance, Gmail will detect whether you’ve used the word “attached” in an email you’ve written and, if you try to send it without an attachment, will ask you if you meant to include one. GitHub, a popular website used by software developers to collaborate on code, forces you to type the full name of the project in order to delete it.
GitHub makes it harder for you to accidentally delete your projects, by design.
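The pattern itself is easy to sketch. Here is a minimal, hypothetical Python version of the type-the-name confirmation; the function and names are illustrative, not GitHub's actual code.

```python
def confirm_delete(project_name, typed_response):
    """Only proceed when the user types the project's full name exactly."""
    return typed_response == project_name

# The extra typing forces attention to the task, breaking autopilot clicks.
assert confirm_delete("my-website", "my-website") is True
assert confirm_delete("my-website", "yes") is False
```

Requiring an exact match means a reflexive "yes" or a stray click can never trigger the destructive action.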
Most of these examples work by forcing your attention to the task at hand, breaking your autopilot behavior and making you really consider what you are about to do. Design details don’t make it impossible to screw up, but they certainly make it a little bit harder.
Still other designs revolve around keeping your information secure. Your computer may prompt you for a login if you’ve left it idle for a few minutes, preventing someone else from seeing or stealing sensitive information. Smartphones often do the same thing, requiring a passcode to re-enter. Some web browsers will prevent you from downloading certain files, and your computer’s operating system may ask you if you are SURE you want to open a program you got from the internet. Connect a smartphone to a new computer and it may ask you to confirm that this computer can be trusted. These security measures don’t prevent you from doing dangerous things, but try to prevent a potential horrible outcome due to careless mistakes.
Let’s say it’s too late to prevent the error: the mistake has occurred, the failure has happened. What now? This is where fail-safe design comes in. Fail-safe design prevents failure from becoming absolute catastrophe.
In some cases, it’s the system (or environment) that has failed. In the event of a fire, fire doors are required by law to fail unlocked, so that people can escape a burning building. On the other hand, if you need to protect state secrets or cash in a bank vault, you’d probably want a fail-secure system for those doors, which would fail locked.
Circuit breakers cut the power if an electrical current gets too high. Elevators have brakes and other fail-safe systems that engage if the cable breaks or power goes out, keeping the elevator from plummeting to its passengers’ death.
In other instances, it’s our own human error that the fail-safe system is designed for. SawStop is a table-saw safety technology that automatically shuts off a spinning saw blade if it comes in contact with flesh. The blade has a sensor that can detect whether it’s a piece of wood or your finger, using the same property (electrical conductivity) that makes a touch screen sensitive to your bare fingers but not to your gloves. In less than one thousandth of a second, the saw blade will shut off, giving you in the worst case only a small nick (rather than removing your thumb). Don’t believe this could work so fast? Watch this video:
SawStop, a table saw brake technology that can help you keep your fingers. Source: SawStop
Some industrial paper cutters are designed to shut off if they detect motion nearby (presumably a hand getting too close to the blade). Similarly, many automatic garage doors will stop closing if they sense something, or someone, in the way.
Another well-known fail-safe measure is the dead man’s switch. The dead man’s switch kicks in when a human in charge lets go of the controls — or dies, as the name implies. In the event of an accident (say, a train operator has a heart attack), the dead man’s switch can prevent harm to all the passengers by stopping the train.
This actually happened a few years ago on the New York City subway, when an MTA employee had a heart attack on the G train. His hands lost grip of the controls, the brakes were activated, and the train slowed to a stop.
The dead man’s switch is also a common device in lawn mowers and other equipment that require you to continually hold down a lever or handle to operate. As soon as you let go, the motor stops. U.S. law actually specifies that all walk-behind lawn mowers come equipped with such a switch that stops the blade within 3 seconds of a user releasing her grip.
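The same pattern shows up in code as a watchdog timer. Below is a minimal Python sketch; the class name and timeout are illustrative assumptions, not any particular product's implementation.

```python
import time

class DeadMansSwitch:
    """Keeps the 'motor' running only while hold() is called regularly."""
    def __init__(self, timeout=3.0):
        self.timeout = timeout
        self.last_hold = time.monotonic()

    def hold(self):
        # The operator is still gripping the lever.
        self.last_hold = time.monotonic()

    def motor_running(self):
        # Fail safe: if the grip lapses longer than the timeout, stop.
        return (time.monotonic() - self.last_hold) < self.timeout

switch = DeadMansSwitch(timeout=0.1)
assert switch.motor_running()
time.sleep(0.2)   # the operator lets go
assert not switch.motor_running()
```

The key design choice is that the safe state is the default: stopping requires no action at all, while running requires continuous proof that someone is in control.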
In software, absolute catastrophe often means losing your work, your files, that long heartfelt email you worked so hard on. So many fail-safe designs revolve around letting you undo actions or automatically saving work in the background as you go along. Auto-saving Google Docs are a vast improvement over other word-processing programs that can lose hours of work with a single crash or loss of power. Web browsers like Chrome can restore all your tabs if you accidentally close a window (even if you’d rather declare tab bankruptcy).
Finally, we have the last-ditch, eleventh-hour design solution to keep you safe from the worst of the worst: The backup.
A backup parachute is perhaps the most dramatic of all backup devices, but many things in the real world are designed to have similar built-in redundancies. Cars have two sets of brake circuits (not to mention a spare tire). Airplanes have multiple redundant control systems. Emergency stairwells have lights that work on battery power if the building’s electricity goes out. On computers, backing up your photos or making a copy of a file before editing it is just common sense.
Backup parachutes: don’t leave home without ‘em. Source: mopteek via Flickr
In the end, nothing humans build or even touch will ever be free from error. Luckily, designers work tirelessly to save us from our mistakes. And in many cases, we don’t have to know how the poka-yoke works. It just works.
Source: Original Article from ProPublica.org

A More Secure and Anonymous ProPublica Using Tor Hidden Services


Update, January 15: Our configuration has been updated; the walkthrough now notes that you can use Unix sockets for HiddenServicePort.
There’s a new way to browse our website more securely and anonymously. To do it, you’ll need a bit of software called the Tor Browser. Once you’ve got it installed, copy and paste this URL into the running Tor browser: http://www.propub3r6espa33w.onion/

This is called a “Tor hidden service.” Tor is a network of internet relays (and a web browser that uses the network) that protects your privacy by hiding your browsing habits from your internet service provider, and hiding your IP address from the websites you visit. A Tor hidden service is a special type of website that can only be visited with Tor, masking your digital trail as much as possible. (Disclosure: Outside of my work at ProPublica, I’m also the developer of Onion Browser, an unofficial Tor Browser for iOS.)
We launched this in part because we do a lot of reporting, writing, and coding about issues like media censorship, digital privacy and surveillance, and breaches of private medical information. Readers use our interactive databases to see data that reveals a lot about themselves, such as whether their doctor receives payments from drug companies. Our readers should never need to worry that somebody else is watching what they’re doing on our site. So we made our site available as a Tor hidden service to give readers a way to browse our site while leaving behind less of a digital trail.
We actually launched it quietly as an experiment last year, shortly after publishing Inside the Firewall, an interactive news application about online media censorship in China. While we’re not aware of any countries currently blocking access to ProPublica, I was curious to see what we could do to improve access to readers if that ever happens.
While using our Tor hidden service greatly increases your privacy, it’s important to note that it is, for the most part, the same website people see on the regular Internet. Like all websites, ours contains embedded multimedia and code from external services like Google Analytics, Facebook “Like” buttons, etc. – important tools that help us engage with our audience and quantify how well we’re doing. (Our privacy policy outlines some of the things we use.) While we are still thinking through how to handle these things in our hidden service, the Tor Browser does obscure the identifying metadata that these external services can see, like your IP address, your location, and details about the computer you are using. And if you want to maximize your anonymity by blocking those external services, it’s easy to do yourself in the Tor Browser by increasing the “security level” to “high.”

About Tor & Hidden Services

A Tor hidden service (sometimes called an “onion site” or an “onion service”) has a special domain name that ends in .onion – like propub3r6espa33w.onion – that you can only connect to using Tor. These special websites and services use strong encryption (even if the URL doesn’t start with https), mask metadata like the IP address of the user, and even mask the address of the site they’re visiting.
Collectively, sites like these are often referred to as being part of the “dark web” though the term is contentious in the Tor developer community, thanks to its association with sites like Silk Road, an illicit online drug market that was seized by the FBI in 2013. But regardless of how it’s misused, the dark web has legitimate and even critical utility in keeping the Internet safe and private:
  • ProPublica and several other journalism and human rights organizations use SecureDrop to allow sources and whistleblowers to safely transmit sensitive files.
  • The email hosting service Riseup uses onion services to allow users to access their email ultra-securely.
  • The chat program Ricochet uses onion services under the hood to allow users to securely chat with each other without relying on any central servers.
  • Facebook launched an onion site in 2014 to improve access to Facebook over Tor, with an eye toward privacy-conscious users and those in countries where Facebook is blocked.

How is a Hidden Service Different?

You are probably already used to using a secure browser when browsing many sites, especially when banking or shopping. Your web browser lets you know when a site uses “HTTPS” by displaying a lock in the address bar. How is a Tor hidden service different, and why is it more secure than HTTPS?

Browsing Normally

When you’re on a site that uses HTTPS encryption, the connection between your web browser and the site is secure, but important metadata is still visible to servers that can observe your connection, like your ISP or the wifi router you use at a coffee shop. Those can know, for instance, what sites you visit, and can see any unencrypted images and scripts that get loaded in your browser.

Browsing Normal Sites Using Tor

Tor provides some anonymity by relaying traffic through different servers as you browse. This makes it seem to a web server that you are coming from somewhere else. Tor picks three random relays and routes your traffic through each. No relay gets the “whole picture” (that you are visiting propublica.org), because Tor encrypts your connection three times before sending it out: the first layer can only be decoded by the first relay, the second layer can only be decoded by the second, and so on.
Think of this way of layering encryption through relays as like the layers of an onion. Hence the name “Tor,” which was originally an acronym for “the onion router.” Though the longer name has fallen out of use, the onion metaphor is still pretty common when discussing Tor and software that uses it.
When you’re browsing using a Tor browser, your ISP only knows you are using Tor, not what sites you’re visiting or what you’re doing, even when you’re connecting to a non-HTTPS site. The first relay knows your actual IP address and ISP, and knows the address of the second relay. The second relay knows about the first relay and third relay but can’t unencrypt your data to see what you’re doing. The third relay (or “exit relay”) knows about the second relay and the site you are going to, and it can see any unencrypted data that you’re browsing. It’s possible for the sites you visit to know that you’re using Tor because the list of exit nodes is openly known, but they have no way of knowing your real IP address.
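To make the layering concrete, here is a toy Python sketch of onion-style wrapping and peeling. The XOR "cipher" below is strictly illustrative and not real cryptography (Tor uses proper authenticated encryption); the relay keys and message are made-up values.

```python
import hashlib

def keystream(key, length):
    """Derive a repeatable pseudo-random keystream from a key (toy cipher only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_layer(data, key):
    # XOR with the keystream both adds and removes a layer (XOR is symmetric).
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# The client wraps the message once per relay, innermost layer applied last.
relay_keys = [b"relay1", b"relay2", b"relay3"]
message = b"GET propublica.org"
cell = message
for key in reversed(relay_keys):
    cell = xor_layer(cell, key)    # add one layer of the onion

# Each relay peels exactly one layer; only after the last peel is the
# plaintext visible, so no single relay sees the whole picture.
for key in relay_keys:
    cell = xor_layer(cell, key)
assert cell == message
```

The structural point survives the toy cipher: the first relay can remove only its own layer, so what it forwards to the second relay is still opaque to it.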
Although the exit relay that sends your Tor connection to the normal internet does not know your IP address (since your connection was forwarded to it by another relay), it has access to your metadata, like which sites you are visiting, and unencrypted data because it needs to know how to pass your request on to a desired website.

Browsing Onion Sites Using Tor

An onion site uses the encrypted and anonymous Tor connection from your computer all the way to the websites you visit. Just as before, Tor picks three random relays, but in this case, a copy of Tor we’re running also picks three random relays and the relays meet in the middle.
Your ISP knows you are using Tor. As before, the first relay knows that you are a Tor user, knows your IP address and ISP, and knows the address of the second relay. The chain of relays, which know only the connections before and after them, continue as before except now there are six of them.
As in normal Tor use, none of the relays between a user and the website see the “whole picture”. But the onion site connection never has to leave those confines to connect to the normal Internet, which exposes metadata. To a relay, both a user and our website look like normal Tor clients, and no relay knows any more than that.
More technical detail, such as how the two chains know how to meet, can be found here and in the Tor design paper.

How to Run Your Own Hidden Service

If you run a website and want to run a Tor hidden service, here’s how. I’ll assume from here on out that you’ve got a fair bit of technical knowledge and understand how to run a web server, use the command line and edit configuration files. The Windows command-line “expert” version of Tor is a bit finicky and I don’t have a lot of experience using it, so for now, these instructions will be Mac OS X and Linux-specific. (Are you a Windows Tor expert and interested in helping me write Windows-specific sections of these docs? Please get in touch!)
The following instructions will help you set up a demonstration onion site on your own computer. For a production site, there are a few other things you’ll want to consider that I’ll discuss toward the end.

Step 1: Install Tor

First, you’ll want to install a command-line version of Tor.
The easiest way to do this is to use your package manager to install Tor — Homebrew on Mac, apt-get or yum or whichever manager you use on Linux. The invocation is usually something like brew install tor or apt-get install tor.
If you’re on Mac OS X, after installing you will be prompted with optional commands to have launchd start tor at login. You can skip this for now.

Step 2: Configure a Tor Hidden Service

Once installed, edit Tor’s configuration file. For OS X Homebrew, you’ll want to create the file at /usr/local/etc/tor/torrc. For Linux, you’ll generally find that the configuration already exists at /etc/tor/torrc. Open that in your code editor of choice.
Add two lines like this:
HiddenServiceDir /tmp/test-onion-config
HiddenServicePort 80 127.0.0.1:3000
  • HiddenServiceDir: The directory containing test-onion-config needs to exist and needs to be owned by the user running Tor. On Mac OS X, Homebrew will install & launch Tor as your current user, so using /tmp/ is fine (since this is just a test demonstration). If you wish to configure an onion site in OS X that won’t disappear when rebooting, you can use something in your home directory, like /Users/<your_username>/Code/my-onionsite-config. On a Linux machine, you can use something like /var/run/tor/test-onion-config; in Ubuntu, /var/run/tor is already owned by the debian-tor user that runs the Tor daemon.
  • HiddenServicePort: This routes inbound traffic on the given port of xxxxxxxxxxxxxxxx.onion to an IP address and port of your choice. (The target should be an IP address, and it’s recommended that it route to the local machine. It can also be a Unix socket on the local filesystem.) In the HiddenServicePort 80 127.0.0.1:3000 example, a user accessing http://xxxxxxxxxxxxxxxx.onion/ (implied port 80) would be served by the web app running on port 3000 on your computer.
    Unless your underlying website uses some authentication, a Tor hidden service configured like this will be readable by anybody who knows the address. So if you don’t want to test this with a web app you already have on your computer, you can create a simple test by doing:
    $ mkdir /tmp/test-onion-content
    $ cd /tmp/test-onion-content
    $ python -m SimpleHTTPServer 3000
    This will serve the contents of /tmp/test-onion-content at 127.0.0.1:3000 (on Python 3, the equivalent command is python -m http.server 3000), and also at the onion site address being configured.
(You can check out the Tor manual for more information about torrc config lines.)
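Putting the two steps together, here is a minimal shell sketch of creating the service directory and appending the config lines. It writes to a demo file at /tmp/demo-torrc so it is safe to copy; substitute your real torrc path (/usr/local/etc/tor/torrc for Homebrew, /etc/tor/torrc on most Linux systems).

```shell
# Minimal sketch: create the directory and append the two config lines.
# TORRC points at a demo file here; substitute your real torrc path.
TORRC=/tmp/demo-torrc
mkdir -p /tmp/test-onion-config
chmod 700 /tmp/test-onion-config    # Tor rejects overly permissive service dirs
cat >> "$TORRC" <<'EOF'
HiddenServiceDir /tmp/test-onion-config
HiddenServicePort 80 127.0.0.1:3000
EOF
```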

Step 3: Access the Tor hidden service

If you aren’t running your test app on port 3000 (or whichever you chose) yet, do that now. Then start (or restart) Tor:
On Mac OS X, just run tor in a terminal window. If you previously installed Tor with Homebrew and followed the steps to copy plist files to have launchd start tor at login, you can run the following to restart it:
$ launchctl unload ~/Library/LaunchAgents/homebrew.mxcl.tor.plist
$ launchctl load ~/Library/LaunchAgents/homebrew.mxcl.tor.plist
Depending on your flavor of Linux, you’ll need to do one of the following (or something similar):
$ sudo service tor restart
# or
$ sudo systemctl restart tor.service
If all went well, Tor should be running. (If running on the terminal, you’ll see Bootstrapped 100%: Done at some point. Otherwise, you can usually see the status by looking at the Tor log file — /var/log/tor/log, depending on your flavor of Linux.)
Now you’ll find that the HiddenServiceDir (i.e., /var/run/tor/test-onion-config or /tmp/test-onion-config) has been created.
Inside that directory, you’ll see two files: hostname and private_key.
If you open the hostname file, you will see that it contains an .onion address. If you open Tor Browser and try to visit it, you should now see your website.
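You can also do this check from the command line. Here is a sketch that reads the generated address from the /tmp/test-onion-config directory used in the example above, then tries fetching the site through Tor's local SOCKS proxy (port 9050 is Tor's default; adjust if yours differs):

```shell
# Sketch: read the generated .onion address and, if tor is reachable,
# fetch the site through Tor's local SOCKS proxy (default port 9050).
check_onion() {
  addr=$(cat /tmp/test-onion-config/hostname 2>/dev/null || true)
  if [ -n "$addr" ]; then
    curl --socks5-hostname 127.0.0.1:9050 "http://$addr/"
  else
    echo "no hostname file yet -- is tor running with this HiddenServiceDir?"
  fi
}
check_onion
```

The --socks5-hostname flag matters: it makes curl send the hostname to the proxy for resolution, so the .onion name never hits normal DNS.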

More Tor Hidden Services

The torrc file can contain more than one hidden service, and hidden services can also operate on several ports. For example, Facebook’s facebookcorewwwi.onion listens on both HTTP (port 80) and HTTPS (port 443). In cases like this, a torrc file will look something like this:
HiddenServiceDir /var/run/tor/main-onion-site
HiddenServicePort 80 127.0.0.1:80
HiddenServicePort 3000 127.0.0.1:3000

HiddenServiceDir /var/run/tor/other-onion-site
HiddenServicePort 80 127.0.0.1:9000
In this case, the “main” site will serve two ports: http://xxxxxxxxxxxxxxxx.onion/ and http://xxxxxxxxxxxxxxxx.onion:3000/ (routing to what is running on ports 80 and 3000 locally). The “other” site will be available at http://yyyyyyyyyyyyyyyy.onion/ (routing to what is being served at port 9000 locally).
A little-known secret is that you can also use subdomains with onion sites: the web server listening on the HiddenServicePort target just needs to respond to the requested hostname. This works because recent versions of Tor will handle a connection to www.xxxxxxxxxxxxxxxx.onion as a connection to xxxxxxxxxxxxxxxx.onion, and your browser will state the subdomain it wants as part of the request inside that connection. You can see an example onion site subdomain configuration here.
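As an illustration, a hypothetical nginx sketch of this: one server block answering both the bare onion hostname and a www subdomain on the port that HiddenServicePort routes to (the onion name and paths are placeholders):

```nginx
# Hypothetical sketch: serve both hostnames on the HiddenServicePort target.
server {
    listen 127.0.0.1:3000;
    server_name xxxxxxxxxxxxxxxx.onion www.xxxxxxxxxxxxxxxx.onion;
    root /tmp/test-onion-content;
}
```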

Custom Hidden Service Names

You may have noticed that we didn’t configure the onion name that served our example site. Given a HiddenServiceDir without a private_key file inside, Tor will randomly generate a private_key and hostname. The 16 characters of the hostname before .onion are actually derived from this key, which allows Tor to confirm that it is connected to the right hidden service.
There are a few tools that allow you to generate a private_key in advance to get a predictable name: Shallot and Scallion are two popular options. Given an existing torrc with an already-configured HiddenServiceDir, you can delete the existing hostname file, drop in the new private_key file, and restart Tor to use your new onion domain.
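As a sketch, the swap looks like this (the key source path is hypothetical; where Shallot or Scallion puts its output varies, and the copy and restart steps are commented out since they depend on your setup):

```shell
# Sketch: drop a pre-generated vanity key into the service directory.
# The cp source path is hypothetical -- it would come from Shallot,
# Scallion, or a similar generator.
DIR=/tmp/test-onion-config          # your HiddenServiceDir from torrc
mkdir -p "$DIR" && chmod 700 "$DIR"
rm -f "$DIR/hostname"               # stale name; Tor regenerates it from the key
# cp /path/to/generated/private_key "$DIR/private_key"
# chmod 600 "$DIR/private_key"
# ...then restart tor and read the new name from "$DIR/hostname"
```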
It’s debatable whether this is a good idea, since it may train users to look for the prefix and ignore the rest of the domain name. For example, an evildoer can generate a lot of propubxxxxxxxxxx.onion domains — how do you know you’re at the right one? Facebook works around this issue by having an SSL certificate for their hidden service to provide a strong signal to users that they’re at the correct onion site. We’re working on adding this to our onion site, too.
When in doubt, a user should try to confirm an onion site’s domain by corroborating it at a variety of sources. (You can find a GPG-signed file confirming our onion addresses here.)

Running in Production

There are a few extra things to think about when running a Tor hidden service in production:
  • While you can run your hidden service on your laptop or workstation, you’ll likely want to use an always-on machine to act as a server. Otherwise, the hidden service goes offline when your computer does.
  • Your HiddenServiceDir should be relatively well-protected. If someone else can see your private_key, they can impersonate your hidden service. On a Linux machine, this is done by making sure that only the user running tor (debian-tor on Ubuntu) can access this directory.
  • The target of your HiddenServicePort should preferably be on the same machine as the web server. While you can map this to any IP address, terminating this connection locally reduces the chances of leaking metadata.
  • You may want to consider where your hidden service is installed. To avoid leaking traffic metadata as much as possible, you can choose to put the hidden service on the same machine as your website so that no Tor traffic has to leave the machine to access the website. Our hidden service is currently hosted on a machine located at the ProPublica offices (and not at our website hosting provider); this is mostly to help us debug issues, but also has the benefit of keeping full control of the machine hosting the hidden service and related log files. In terms of traffic metadata, this mixes encrypted HTTPS traffic from the hidden service with encrypted HTTPS traffic from our own use of the website. I think that’s an acceptable tradeoff (versus leaving hidden service logs available to our web host), but we may re-examine this in the future.
  • If you want to mirror an existing website, it’s worth taking stock of the assets and resources that get loaded on your pages. We’ve made an effort to provide onion services for several subdomains that we use to serve our own assets. But news organizations also publish items that use external media — audio clips, videos, social media posts — to strengthen a story, and we use analytics to measure and understand our audiences. Having these external resources has ramifications (since some of the traffic no longer uses a hidden service and relies on using Tor to access the resource’s normal site) and it’s worth considering this issue on your own site. (As mentioned near the top of this post, there are features in Tor Browser that allow a user to block many of these resources.)
  • Current versions of Tor don’t provide any way of load-balancing large amounts of traffic. So even if you host your content with a production web server such as Apache or nginx, the hidden service endpoint is a single point of failure that can’t currently be scaled up. But Tor developers are working on a way of fixing this in an upcoming version.
  • SSL certificates are more difficult to acquire for Tor hidden services than normal domains. They can be issued, but must be extended validation (“EV” or “green bar”) certificates and undergo more thorough verification than proving ownership of a normal domain name. (We plan to go through this process, and we’ll update this post as we do so.)

Our Hidden Service Mirror

Putting together all of the above, you can get something like our current hidden service.
We use local Unix sockets (instead of an ip:port) for the local connection between Tor and nginx. Other than that, there’s nothing too special about our torrc, which you can find here.
The hidden service running ProPublica’s site at propub3r6espa33w.onion speaks to an instance of nginx that handles routing our subdomains and some processing — such as rewriting “www.propublica.org” links to instead use the onion domain — before proxying on to our normal web server. (In Ubuntu, you can install a version of nginx that contains the extra rewrite modules by installing the nginx-extras package.) You can see this configuration here.
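The link-rewriting step might look something like this hypothetical nginx fragment (sub_filter comes from nginx's substitution module, one of the extras mentioned above; the listen address is an assumption, and ProPublica's real config uses Unix sockets rather than an IP port):

```nginx
# Hypothetical fragment: rewrite absolute links to the onion domain
# before proxying on to the normal web server.
server {
    listen 127.0.0.1:8080;
    server_name propub3r6espa33w.onion;
    location / {
        proxy_pass https://www.propublica.org;
        sub_filter 'www.propublica.org' 'propub3r6espa33w.onion';
        sub_filter_once off;   # replace every occurrence, not just the first
    }
}
```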
You might notice that our hidden service does experimentally listen to HTTPS connections, but we’re currently using a self-signed key for that, which can cause a combination of browser errors and assets not loading if you try to visit our onion site that way. We’re working on getting a valid SSL certificate for our hidden service, and that should hopefully be fixed sometime soon.
If you have any concerns or feedback about this tutorial or the configuration I’ve shared, please get in touch. (My PGP key is 0x6E0E9923 and you can get it here, on Keybase and on most keyservers.)
Source: Original Article from ProPublica.org