
UNDERGROUND

                    [ANSI art banner: HaCKeR'S oVeRHeaRD]

                                      BY
                                  HeadHunter

With the positive response I received from my last article, I've decided to
continue the series in this issue of SYNCHRONETICS E-ZINE. Last article,
I discussed the most primitive form of hacking, guessing user account
passwords, otherwise known as the Brute-Force method. This would probably be
the most common form of hacking encountered on most systems. This issue, I'd
like to focus on some of the more esoteric, and more effective, forms of
hacking that you may encounter in your career as a SysOp.

Before I get started I would like to take a moment to reiterate the fact that
most hackers are just curious people who relish the kind of challenge that
system intrusion offers. We don't intend to harm the system or destroy system
data and resources. There is, however, a small minority of hackers who hack,
not for the challenge or the knowledge gained, but with a darker purpose in
mind. It is these immature, bed-wetting lamers that I endeavour to thwart.

The art of hacking is the art of understanding human nature and psychology.
And two seemingly universal constants of human nature are 1) Laziness, and
2) Trust. First, we'll examine laziness. Hackers understand most people
will try to get away with as little work as possible to accomplish a task.
You can call it whatever you want, laziness, procrastination, working
"smart", unwillingness to RTFM (*editor's note: Read The Fudging Manual),
etc. I call it a window of opportunity not to be passed up. Many times,
when people set up complex systems, they will leave default passwords and
login accounts in place, either from a lack of foresight or from ignorance.
That makes it easy as pie for hackers to get a copy of that system's manual
and take advantage of that laziness. So it behooves you to READ the manual
and make certain that no "loose-ends" such as those are available to make a
hacker's job easy (believe me, they'll thank you for it later).
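If that sounds abstract, here's a little sketch (in Python, which is hardly DOS-era, but it shows the idea) of the kind of check a careful SysOp might run against his own account list. The credential pairs and account names below are invented examples for illustration, not pulled from any real system's manual:

```python
# Hypothetical factory-default logins a lazy installer might leave in place.
# These pairs are made-up examples, not an actual vendor list.
DEFAULT_CREDENTIALS = {
    ("SYSOP", "SYSOP"),
    ("ADMIN", "ADMIN"),
    ("GUEST", "GUEST"),
}

def find_default_accounts(accounts):
    """Return the (user, password) pairs still set to a factory default."""
    return [(u, p) for (u, p) in accounts
            if (u.upper(), p.upper()) in DEFAULT_CREDENTIALS]

# One account here was never changed from its default -- the scan flags it.
accounts = [("SYSOP", "sysop"), ("yojimbo", "k4tana!")]
print(find_default_accounts(accounts))  # [('SYSOP', 'sysop')]
```

The point isn't this particular script; it's that the defaults are printed right in the manual, so checking for them takes a hacker five minutes. It should take you five minutes too.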

Trust. Just saying it gives me the warm fuzzies. Everybody wants to be able
to trust his fellow man. It's just human nature. I mean, no one wants to be
thought of as a paranoid schizoid with a persecution complex, do they? So,
unless we have reason not to trust someone's intentions, we usually do.
Otherwise, we'd all sit in corner booths at restaurants with our backs to
the wall, right Yojimbo(*editor's note: Just because you're paranoid
doesn't mean someone's *not* out to get you)? Heheheheh... Long story. Well
anyway, as I was saying, we like to trust people and in turn, like to be
trusted. But when you're a system administrator, you can't afford such
warm and fuzzy luxuries. Caution should be the word for the day. And just
to prove my contention about trust, what if I were to say that one of
Yojimbo's command shells he created actually has a back door written into it
that will allow anyone to drop to DOS? How many of you looked at his .SRC
files before you set up the command shells on your system? Not too many. Just
as I suspected (*editor's note: There are NO back doors in any of my command
shells, HeadHunter was merely using that as a practical illustration). That's
what I'm talking about when I say trust (and I guess that's a good example
of laziness too!). Never take anything you put on your system at face value.
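So what does a back door actually look like when you *do* read the source? Here's a deliberately trivial sketch (invented for this article, and definitely not from any of Yojimbo's shells) of the classic pattern: a magic input, buried in the command dispatcher, that skips the authorization check entirely:

```python
# Hypothetical example of the kind of back door to hunt for when auditing
# third-party shell/menu source before installing it on your board.
def handle_command(cmd, authorized=False):
    if cmd == "!!DOS!!":          # hidden trigger buried in the dispatch code
        return "DROP_TO_DOS"      # bypasses the authorization check entirely
    if not authorized:
        return "DENIED"
    return "RUN:" + cmd

print(handle_command("dir"))      # DENIED
print(handle_command("!!DOS!!"))  # DROP_TO_DOS -- no authorization needed
```

In a real shell the trigger will be better hidden than a string literal, but the shape is the same: some code path that reaches a privileged action without going through the normal checks. That's what you're reading the .SRC files for.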

If users are the weakest link in your system security chain, I'd have to
say external programs (i.e. DOORS) run a very close second. You never
really know how secure an external program is unless it's one you've
written yourself (and even then there could be bugs). Many hackers are also
proficient programmers, and a common tactic of the more creative among us
is to write a program with a built in backdoor to your system.

Of course, writing an entire external program may seem a little much, and
hackers being human, they are also bound by at least one of those universal
constants I was speaking of, laziness. So a common tactic is to write a
"patch" to an already existing program that either alters it's original
code to create a back door, or, just replaces the old executable with a new
one that has a back door already coded into it. That way, they can just
upload it to their favorite BBS (under a false name, of course) and see who
downloads it. And to lend credence to their bogus program, they'll usually
add in the usual .DOCs and legal disclosures that you might find in a real
program/archive/update from that particular company. Pretty sneaky.

So, you're probably wondering, how do you defend against that kind of
underhanded attack? Well, for one thing, a great number of programmers are
using PKZIP's authenticity verification option when they create archives to
be released to the general public. This allows them to basically add a
"signature" to the ZIP file that verifies that it came from that
programmer or company. Signatures can be forged, but it's not very easy.
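PKZIP's actual -AV scheme is PKWARE's own business, so here's just a sketch of the underlying idea using a keyed hash (Python's hmac module standing in for the real thing): the author tags the archive bytes with a secret key, and anyone tampering with the file, without that secret, can't produce a tag that still checks out:

```python
import hashlib
import hmac

SECRET = b"authors-private-key"  # invented placeholder, not a real key

def sign(data):
    """Author's side: compute a keyed tag over the archive bytes."""
    return hmac.new(SECRET, data, hashlib.sha256).hexdigest()

def verify(data, tag):
    """Downloader's side: does the tag still match the bytes?"""
    return hmac.compare_digest(sign(data), tag)

archive = b"PK\x03\x04...archive bytes..."
tag = sign(archive)
print(verify(archive, tag))              # True  -- genuine archive
print(verify(archive + b"trojan", tag))  # False -- someone patched it
```

A real signature scheme uses public-key crypto so the verifier doesn't need the secret, but the principle is the same: the tag binds the file to its author, and a forged "update" won't verify.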

Another option is to only get program updates/patches directly from the
company, either through their BBS or by U.S. Snail. But this brings up
another tactic of hackers, the old, "Program update through the mail",
trick. Instead of, or in addition to, uploading their bogus file to a BBS,
the hacker may, if he is targeting a particular system, send a very
official diskette containing an "official" program update, upgrade, or
patch. Now, this can get heavy since this could also constitute mail fraud,
which means most hackers will shy away from such tactics, but if they're
stupid, or don't care about the feds, then they just may try it.

Well, that's it for another edition of HaCKeR'S oVeRHeaRD. I hope I've
enlightened some of you. See you next time in THe DRK CNTNNT......
