Monday, December 8, 2014

Mounting an HFS+ (Mac) Filesystem on Red Hat Linux 6


Mounting an HFS+ filesystem on Red Hat Linux 6

# rpm --import http://elrepo.org/RPM-GPG-KEY-elrepo.org

#  rpm -Uvh http://www.elrepo.org/elrepo-release-6-6.el6.elrepo.noarch.rpm   
    Check http://elrepo.org/tiki/tiki-index.php for the package matching your version of Red Hat/CentOS.

# yum install kmod-hfsplus

Connect the drive. It should be mounted automatically; otherwise, mount it manually using the following commands:

# fdisk -l

Note down the device name, e.g. /dev/sdb1, and confirm the size matches your disk (a 1 TB hard disk may be reported as roughly 1000 GB).

# mkdir /Externaldrive

# mount /dev/sdb1 /Externaldrive

# cd /Externaldrive

# ls    //To list the contents of the drive
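If the volume is a journaled HFS+ filesystem, the Linux hfsplus driver will normally mount it read-only. A commonly used workaround (at your own risk, since it bypasses the journal) is to force a read-write mount:
# mount -t hfsplus -o force,rw /dev/sdb1 /Externaldrive
# mount | grep Externaldrive    //To verify the mount options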

Tuesday, October 14, 2014

Installing Zabbix - Network Monitoring System

ZABBIX Network Monitoring System

Red Hat Enterprise Linux / CentOS

Supported for versions: RHEL 5, RHEL 6, Oracle Linux 5, Oracle Linux 6, CentOS 5, CentOS 6

Installing repository configuration package

Install the repository configuration package. This package contains yum configuration files.
Zabbix 2.2 for RHEL5, Oracle Linux 5, CentOS 5:
# rpm -ivh http://repo.zabbix.com/zabbix/2.2/rhel/5/x86_64/zabbix-release-2.2-1.el5.noarch.rpm
Zabbix 2.2 for RHEL6, Oracle Linux 6, CentOS 6:
# rpm -ivh http://repo.zabbix.com/zabbix/2.2/rhel/6/x86_64/zabbix-release-2.2-1.el6.noarch.rpm

Installing Zabbix packages

Install Zabbix packages. Example for Zabbix server and web frontend with mysql database.
Note: Zabbix official repository provides fping, iksemel, libssh2 packages as well. These packages are located in the non-supported directory.
# yum install zabbix-server-mysql zabbix-web-mysql
Example for installing Zabbix agent only.
# yum install zabbix-agent
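Before starting the agent, point it at your Zabbix server in /etc/zabbix/zabbix_agentd.conf. A minimal sketch (192.168.1.10 and myhost are placeholder values):
# vi /etc/zabbix/zabbix_agentd.conf
Server=192.168.1.10
ServerActive=192.168.1.10
Hostname=myhost
# service zabbix-agent start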

Creating initial database

Create zabbix database and user on MySQL.
# mysql -uroot
mysql> create database zabbix character set utf8 collate utf8_bin;
mysql> grant all privileges on zabbix.* to zabbix@localhost identified by 'zabbix';
mysql> exit
Import the initial schema and data (the version number in the path below may differ from 2.2.0).
# cd /usr/share/doc/zabbix-server-mysql-2.2.0/create
# mysql -uroot zabbix < schema.sql
# mysql -uroot zabbix < images.sql
# mysql -uroot zabbix < data.sql
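To confirm the import worked, a quick sanity check is to list the tables that were created in the zabbix database:
# mysql -uroot zabbix -e "show tables;"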

Starting Zabbix server process

Edit database configuration in zabbix_server.conf
# vi /etc/zabbix/zabbix_server.conf
DBHost=localhost
DBName=zabbix
DBUser=zabbix
DBPassword=zabbix
Start Zabbix server process.
# service zabbix-server start
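To have the server start automatically at boot on RHEL/CentOS 6 (SysV init), enable the service:
# chkconfig zabbix-server on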

Editing PHP configuration for Zabbix frontend

Apache configuration file for Zabbix frontend is located in /etc/httpd/conf.d/zabbix.conf. Some PHP settings are already configured.
php_value max_execution_time 300
php_value memory_limit 128M
php_value post_max_size 16M
php_value upload_max_filesize 2M
php_value max_input_time 300
# php_value date.timezone Europe/Riga
It is necessary to uncomment the “date.timezone” setting and set it to your timezone. After changing the configuration file, restart the Apache web server.
# service httpd restart
# sestatus
# setenforce 0
The first command shows the current SELinux status; the second switches SELinux to permissive mode. Otherwise SELinux may prevent the frontend from retrieving data from the Zabbix server.
Zabbix frontend is available at http://localhost/zabbix in the browser. Default username/password is Admin/zabbix.

Tuesday, October 7, 2014

BPDU Guard vs BPDU Filter


In a stunning moment of clarity I figured out the difference between the two. It took far longer than it should have, but I now feel I can tick these two technologies off as understood: why you would use them and when you would use them.
Bridge Protocol Data Units, also known as BPDUs, play a fundamental part in a spanning-tree topology. No matter your flavour of STP, you will have BPDUs.
BPDU – A quick breakdown
BPDUs are sent out by a switch to exchange information about bridge IDs and the cost of the root path. A switch sources them from its own MAC address and sends them to the STP multicast address 01:80:c2:00:00:00. There are Configuration BPDUs, Topology Change Notification BPDUs and Topology Change Notification Acknowledgement BPDUs. Exchanged every 2 seconds by default, BPDUs allow switches to keep track of network changes and to decide when to block or forward ports to ensure a loop-free topology.
BPDU Guard
BPDU Guard is designed to protect your switching network. Remember that a PortFast port is designed to be connected to a device from which BPDUs aren’t expected, such as an end-user device, server or access point. When an unexpected BPDU is detected (say an end user plugs a switch in at their cubicle), the port will shut down and enter an err-disabled state.
When enabled globally, this is a fantastic way to protect PortFast ports on access switches where you don’t expect a switch to be plugged in. BPDU Guard enabled globally is conditional: it applies only to ports that are PortFast enabled. If you require BPDU Guard to be enabled unconditionally, you must enable it on the port itself.
Global
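A typical way to enable this globally on Cisco IOS (it applies only to PortFast-enabled ports) is:
Switch(config)# spanning-tree portfast bpduguard default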
Interface
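To enable it unconditionally on a specific port (GigabitEthernet0/1 is just an example interface):
Switch(config)# interface GigabitEthernet0/1
Switch(config-if)# spanning-tree bpduguard enable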
BPDU Filter
Initially I was stumped as to why you would use this. Why on earth would you want to stop BPDUs from being sent or received on a port? I immediately thought it was ludicrous. It wasn’t until I had a discussion with the man of infinite wisdom @networkjanitor (Kurt Bales) that I understood its use. The point of demarcation is a fantastic place to use BPDU Filter. When an ISP hands off a tail in the DC from their switch infrastructure, neither party wants anything to do with the other’s STP topology. This is one of the uses of this feature, and probably the best one I have found.
First of all, BPDU Filter effectively disables spanning tree on a port, period. It does this by suppressing the sending and receiving of BPDUs. Simple enough. When enabled globally, BPDU Filter applies to all PortFast ports; when such a port links up it transmits a few BPDUs before it starts filtering.
Remember that if a BPDU is received on a PortFast interface, the interface loses its PortFast status, and because globally applied BPDU filtering relies on PortFast, the filtering is disabled as well.
Global
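The usual global form on Cisco IOS (applies to PortFast ports only) is:
Switch(config)# spanning-tree portfast bpdufilter default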
Interface
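And per interface, which filters BPDUs unconditionally on that port (again, the interface name is just an example):
Switch(config)# interface GigabitEthernet0/1
Switch(config-if)# spanning-tree bpdufilter enable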

Anthony’s Wrap
I’ve used BPDU Guard a whole lot. After learning at college that you could bring down an entire block of labs with a switch configured a certain way, I made sure that no network under my jurisdiction would suffer the same fate. Couple BPDU Guard with err-disable recovery and you have protection, as sketched below. BPDU Filter could be placed on access-layer ports too, as another way to negate pesky attacks from inquisitive minds.
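A sketch of that pairing in Cisco IOS (the 300-second recovery interval is an arbitrary example value):
Switch(config)# errdisable recovery cause bpduguard
Switch(config)# errdisable recovery interval 300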
 source: http://networkinferno.net/clarity-bpdu-guard-vs-bpdu-filter

Renewing an IIS 7 SSL Certificate


If you are renewing your GeoTrust SSL certificate running on Microsoft Internet Information Services (IIS) 7, you will need to perform some simple tasks on your IIS 7 web server before placing an order to renew your expiring SSL certificate.

Generate Renewal Certificate Request File (CSR)
  1. Open the Internet Information Services (IIS) Manager. From the Start button select Programs > Administrative Tools > Internet Information Services Manager.
  2. In the IIS Manager, select the main server node on the top left under Connections
  3. In the Features pane (the middle pane), double-click the Server Certificates option located under the IIS or Security heading (depending on your current group-by view).
  4. URGENT!! There is a known bug in IIS7 when using the "Renew" link to renew your SSL certificate. Please do not use the "Renew" link.
    From the Actions pane on the top right, select Create Certificate Request (DO NOT SELECT THE RENEW LINK). The Distinguished Name Properties dialog box opens. 
  5. You will be asked for several pieces of info which will be used by GeoTrust to create your new SSL certificate. These fields include the Common Name (aka domain, FQDN), organization, country, key bit length, etc. Use the CSR Legend in the right-hand column of this page to guide you when asked for this information. The following characters should not be used when typing in your CSR input: < > ~ ! @ # $ % ^ / \ ( ) ? , &
  6. THIS IS THE MOST IMPORTANT STEP! Enter your site's Common Name. The Common Name is the fully-qualified domain name for your web site or mail server. Whatever your end users will see in their browser's address bar is what you should put here. Do not include http:// or https://. Refer to the CSR Legend in the right-hand column of this page for examples. If this is wrong, your certificate will not work properly.
  7. Enter your Organization (e.g., Gotham Books Inc) and Organizational Unit (e.g., Internet Sales). Click Next.
  8. Enter the rest of the fields using the CSR Legend in the right-hand column of this page for guidance and examples.
  9. Click Next to continue.
  10. The next screen of the wizard asks you to choose cryptography options. The default Microsoft RSA SChannel Cryptographic Provider with a key bit-length of 2048 is fine.
  11. Click Next to continue.
  12. Finally, specify a file name for the certificate request. It doesn't matter what you call it or where you save it as long as you know where to find it. You'll need it in the next step. We recommend calling it certreq.txt.
  13. Click Finish to complete the certificate request (CSR) Wizard.
  14. Now, from a simple text editor such as Notepad (do not use Word), open the CSR file you just created at c:\certreq.txt (your path/filename may be different). You will need to copy-and-paste the contents of this file, including the top and bottom lines, into the relevant box during the online order process.
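Optionally, if you have OpenSSL available somewhere (it is not part of IIS), you can inspect the generated CSR before submitting it, to double-check the Common Name and other fields:
openssl req -noout -text -in certreq.txt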

Source: https://www.geocerts.com/csr/iis_renew_7

Checking MD5 & SHA1 Checksums


Linux

Move the downloaded file and the associated MD5 hash file to a folder and execute the following command from the command line.
md5sum -c datei.md5
To check the SHA1 file, execute the following command similarly.
sha1sum -c datei.sha1
Example
benutzer:~/Ordner$ ls
datei.iso  datei.md5

benutzer:~/Ordner$ md5sum datei.iso 
161a1957728be5d530c3fab67ac40652  datei.iso

benutzer:~/Ordner$ cat datei.md5 
161a1957728be5d530c3fab67ac40652  datei.iso

benutzer:~/Ordner$ md5sum -c datei.md5 
datei.iso: OK

Windows

Move the downloaded file and the associated MD5 hash file to a folder and execute the following command using fciv.exe (Microsoft's File Checksum Integrity Verifier).
fciv.exe -v -md5 datei.md5
To check the SHA1 file, execute the following command similarly.
fciv.exe -v -sha1 datei.sha1
If the checksums agree then the downloaded file is free from errors.

Generating MD5 & SHA1 Checksums

Linux

To create an MD5 file for a file that you would like to provide for downloading, enter the following command from the command line.
md5sum datei > datei.md5
tail -c 10MB datei | md5sum > datei.md5    (to calculate a quick fingerprint based only on the last 10 MB of the file; this is faster, but it is not a full-file integrity check)
To create a SHA1 file, execute the following command similarly.
sha1sum datei > datei.sha1

Windows

To create an MD5 file for a file that you would like to provide for downloading, enter the following command using fciv.exe.
fciv.exe -add -md5 datei.md5
To create a SHA1 file, execute the following command similarly.
fciv.exe -add -sha1 datei.sha1

Security Instructions

MD5 is no longer considered secure, because it has been shown that different files with the same MD5 checksum can be created with little effort. In this manner, an attacker can create an infected file whose hash value agrees with the original checksum during the check, even though its content differs from the original file.
Progress has also been made in attacking SHA1. However, SHA1 is still not considered broken in practice, because attacks on it currently require too much time and effort for everyday use.
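If a stronger hash is preferred, the same workflow applies unchanged with SHA-256; sha256sum is part of GNU coreutils on current Linux distributions:
sha256sum datei > datei.sha256
sha256sum -c datei.sha256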

Exporting/Importing VMs in XenServer using the 'xe' command

Exporting a VM to an .xva file using the 'xe' command:
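A minimal sketch of the export side (the VM names, uuids and paths below are placeholders; it is safest to shut the VM down first so the exported image is consistent):
# xe vm-list    //To find the uuid or name-label of the VM to export
# xe vm-shutdown uuid=<uuid of VM>
# xe vm-export vm=<uuid or name-label of VM> filename=/mnt/Export/<name of xva>.xva    //Writes the VM to an .xva file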

Importing an .xva file as a VM using the 'xe' command:

 
# xe vm-import filename=/mnt/Export/<name of ova.xml>    //Imports the VM into the default SR (Storage Repository) and returns the uuid of the imported VM on success.
Note: If multiple XenServer hosts are in a pool, it selects the default shared SR. To import into an SR other than the default one, execute the following commands:
# xe sr-list    //To display the list of Storage Repositories and their details, including uuid
# xe vm-import filename=/mnt/Export/<name of xva> force=true sr-uuid=<uuid of destination SR> preserve=true    //preserve=true keeps the original MAC addresses of the VM.
Note: To check md5 checksum of exported/downloaded file before importing: http://www.thomas-krenn.com/en/wiki/Using_Md5sum_und_sha1sum_for_Checking_Downloaded_Files 

Friday, October 3, 2014

cURL Examples


cURL

cURL is a software package which consists of a command-line tool and a library for transferring data using URL syntax.
cURL supports various protocols such as DICT, FILE, FTP, FTPS, Gopher, HTTP, HTTPS, IMAP, IMAPS, LDAP, LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMTP, SMTPS, Telnet and TFTP.
This article provides 15 practical cURL usage examples.

1. Download a Single File

The following command will get the content of the URL and display it on STDOUT (i.e. on your terminal).
$ curl http://www.centos.org
To store the output in a file, you can redirect it as shown below. This will also display some additional download statistics.
$ curl http://www.centos.org > centos-org.html
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 27329    0 27329    0     0   104k      0 --:--:-- --:--:-- --:--:--  167k

2. Save the cURL Output to a file

We can save the result of the curl command to a file by using -o/-O options.
  • -o (lowercase o) the result will be saved in the filename provided in the command line
  • -O (uppercase O) the filename in the URL will be taken and it will be used as the filename to store the result
$ curl -o mygettext.html http://www.gnu.org/software/gettext/manual/gettext.html
Now the page gettext.html will be saved in the file named ‘mygettext.html’. You can also note that when running curl with the -o option, it displays the progress meter for the download as follows.
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
 66 1215k   66  805k    0     0  33060      0  0:00:37  0:00:24  0:00:13 45900
100 1215k  100 1215k    0     0  39474      0  0:00:31  0:00:31 --:--:-- 68987
When you use curl -O (uppercase O), it will save the content in a file named ‘gettext.html’ on the local machine.
$ curl -O http://www.gnu.org/software/gettext/manual/gettext.html
Note: When curl has to write the data to the terminal, it disables the progress meter to avoid confusion in printing. We can use the ‘>’, ‘-o’ or ‘-O’ options to write the result to a file instead.
Similar to cURL, you can also use wget to download files.

3. Fetch Multiple Files at a time

We can download multiple files in a single shot by specifying the URLs on the command line.
Syntax:
$ curl -O URL1 -O URL2
The command below will download both index.html and gettext.html and save them under the same names in the current directory.
$ curl -O http://www.gnu.org/software/gettext/manual/html_node/index.html -O http://www.gnu.org/software/gettext/manual/gettext.html
Please note that when we download multiple files from the same server as shown above, curl will try to re-use the connection.

4. Follow HTTP Location Headers with -L option

By default cURL doesn’t follow HTTP Location headers (also referred to as redirects). When a requested web page has moved to another place, an HTTP Location header is sent in the response, indicating where the actual web page is located.
For example, when someone types google.com in the browser from India, it will be automatically redirected to ‘google.co.in’. This is done based on the HTTP Location header as shown below.
$ curl http://www.google.com

<TITLE>302 Moved</TITLE>
<H1>302 Moved</H1>
The document has moved
<A HREF="http://www.google.co.in/">here</A>
The above output says that the requested document is moved to ‘http://www.google.co.in/’.
We can instruct curl to follow the redirect using the -L option, as shown below. Now it will download google.co.in’s HTML source code.
$ curl -L http://www.google.com

5. Continue/Resume a Previous Download

Using curl -C option, you can continue a download which was stopped already for some reason. This will be helpful when you download large files, and the download got interrupted.
If we say ‘-C -’, then curl will work out by itself where to resume the download. We can also give an offset with ‘-C <offset>’; the given number of bytes will be skipped from the beginning of the source file.
Start a big download using curl, and press Ctrl-C to stop it in between the download.
$ curl -O http://www.gnu.org/software/gettext/manual/gettext.html
##############             20.1%
Note: -# is used to display a progress bar instead of a progress meter.
Now the above download was stopped at 20.1%. Using “curl -C -”, we can continue the download from where it left off earlier. Now the download continues from 20.1%.
$ curl -C - -O http://www.gnu.org/software/gettext/manual/gettext.html
###############            21.1%

6. Limit the Rate of Data Transfer

You can limit the rate at which data gets transferred using the --limit-rate option. You can specify the maximum transfer rate as its argument.
$ curl --limit-rate 1000B -O http://www.gnu.org/software/gettext/manual/gettext.html
The above command limits the data transfer to 1000 bytes/second. curl may use a higher transfer rate for a short span of time, but on average it stays around 1000 bytes/second.
The following was the progress meter for the above command. You can see that the current speed is near to the 1000 Bytes.
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  1 1215k    1 13601    0     0    957      0  0:21:40  0:00:14  0:21:26   999
  1 1215k    1 14601    0     0    960      0  0:21:36  0:00:15  0:21:21   999
  1 1215k    1 15601    0     0    962      0  0:21:34  0:00:16  0:21:18   999

7. Download a file only if it is modified before/after the given time

We can get files that were modified after a particular time using the -z option in curl. This works for both FTP and HTTP.
$ curl -z 21-Dec-11 http://www.example.com/yy.html
The above command will download yy.html only if it was modified later than the given date and time.
$ curl -z -21-Dec-11 http://www.example.com/yy.html
The above command will download yy.html only if it was modified before the given date and time.
Please refer to ‘man curl_getdate’ for the various date expression syntaxes supported.

8. Pass HTTP Authentication in cURL

Sometimes websites require a username and password to view their content (this can be done with an .htaccess file, for example). With the help of the -u option, we can pass those credentials from cURL to the web server as shown below.
$ curl -u username:password URL
Note: By default curl uses Basic HTTP authentication. We can specify another authentication method using --ntlm or --digest, as in the example below.
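For example, to use Digest authentication instead of Basic (the URL and credentials are placeholders):
$ curl --digest -u username:password URL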

9. Download Files from FTP server

cURL can also be used to download files from FTP servers. If the given FTP path is a directory, by default it will list the files under the specific directory.
$ curl -u ftpuser:ftppass -O ftp://ftp_server/public_html/xss.php
The above command will download the xss.php file from the ftp server and save it in the local directory.
$ curl -u ftpuser:ftppass -O ftp://ftp_server/public_html/
Here, the given URL refers to a directory. So cURL will list all the files and directories under the given URL

10. List/Download using Ranges

cURL supports specifying ranges in the URL. When a range is given, files matching the range will be listed or downloaded. This is helpful for downloading packages from FTP mirror sites.
$ curl   ftp://ftp.uk.debian.org/debian/pool/main/[a-z]/
The above command will list all the packages in the directories a to z in the terminal.

11. Upload Files to FTP Server

cURL can also be used to upload files to an FTP server with the -T option.
$ curl -u ftpuser:ftppass -T myfile.txt ftp://ftp.testserver.com
The above command will upload the file named myfile.txt to the FTP server. You can also upload multiple files at the same time using curl's {} list syntax.
$ curl -u ftpuser:ftppass -T "{file1,file2}" ftp://ftp.testserver.com
Optionally, we can use “-” to read the input from STDIN and transfer it to the remote server.
$ curl -u ftpuser:ftppass -T - ftp://ftp.testserver.com/myfile_1.txt
The above command will get the input from the user from Standard Input and save the contents in the ftp server under the name ‘myfile_1.txt’.
You can provide one ‘-T’ for each URL; each pair specifies what to upload where, as in the example below.
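For example (the file names, paths and server are placeholders):
$ curl -u ftpuser:ftppass -T file1.txt ftp://ftp.testserver.com/dir1/ -T file2.txt ftp://ftp.testserver.com/dir2/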

12. More Information using Verbose and Trace Option

You can see what is going on using the -v option. The -v option enables verbose mode, which prints the details of the request and response.
$ curl -v http://google.co.in
The above command will output the following:
* About to connect() to www.google.co.in port 80 (#0)
*   Trying 74.125.236.56... connected
* Connected to www.google.co.in (74.125.236.56) port 80 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.21.0 (i486-pc-linux-gnu) libcurl/7.21.0 OpenSSL/0.9.8o zlib/1.2.3.4 libidn/1.15 libssh2/1.2.6
> Host: www.google.co.in
> Accept: */*
>
* HTTP 1.0, assume close after body
< HTTP/1.0 200 OK
< Date: Tue, 10 Apr 2012 11:18:39 GMT
< Expires: -1
< Cache-Control: private, max-age=0
< Content-Type: text/html; charset=ISO-8859-1
< Set-Cookie: PREF=ID=7c497a6b15cc092d:FF=0:TM=1334056719:LM=1334056719:S=UORpBwxFmTRkbXLj; expires=Thu, 10-Apr-2014 11:18:39 GMT; path=/; domain=.google.co.in
.
.
If you need more detailed information, you can use the --trace option. The trace option enables a full trace dump of all incoming and outgoing data to the given file.
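For example, the following writes the full trace to a file named trace.txt (the file name is arbitrary):
$ curl --trace trace.txt http://google.co.in
The contents of the trace file look like this: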
=> Send header, 169 bytes (0xa9)
0000: 47 45 54 20 2f 20 48 54 54 50 2f 31 2e 31 0d 0a GET / HTTP/1.1..
0010: 55 73 65 72 2d 41 67 65 6e 74 3a 20 63 75 72 6c User-Agent: curl
..
0060: 2e 32 2e 33 2e 34 20 6c 69 62 69 64 6e 2f 31 2e .2.3.4 libidn/1.
0070: 31 35 20 6c 69 62 73 73 68 32 2f 31 2e 32 2e 36 15 libssh2/1.2.6
0080: 0d 0a 48 6f 73 74 3a 20 77 77 77 2e 67 6f 6f 67 ..Host: www.goog
0090: 6c 65 2e 63 6f 2e 69 6e 0d 0a 41 63 63 65 70 74 le.co.in..Accept
00a0: 3a 20 2a 2f 2a 0d 0a 0d 0a                      : */*....
== Info: HTTP 1.0, assume close after body
<= Recv header, 17 bytes (0x11)
0000: 48 54 54 50 2f 31 2e 30 20 32 30 30 20 4f 4b 0d HTTP/1.0 200 OK.
0010: 0a
The verbose and trace options come in handy when curl fails for some reason and we don’t know why.

13. Get Definition of a Word using DICT Protocol

You can use cURL to get the definition of a word with the help of the DICT protocol. We need to pass a dictionary server URL to it.
$ curl dict://dict.org/d:bash
The above command will list the meaning for bash as follows
151 "Bash" gcide "The Collaborative International Dictionary of English v.0.48"
Bash \Bash\, v. t. [imp. & p. p. {Bashed}; p. pr. & vb. n.
   {Bashing}.] [Perh. of imitative origin; or cf. Dan. baske to
   strike, bask a blow, Sw. basa to beat, bas a beating.]
   To strike heavily; to beat; to crush. [Prov. Eng. & Scot.]
   --Hall Caine.
   [1913 Webster]

         Bash her open with a rock.               --Kipling.
   [Webster 1913 Suppl.]
.
151 "Bash" gcide "The Collaborative International Dictionary of English v.0.48"
Bash \Bash\, n.
   1. a forceful blow, especially one that does damage to its
      target.
      [PJC]
.
.
Now you can see that it uses “The Collaborative International Dictionary of English”. There are many dictionaries available. We can list all of them using:
$ curl dict://dict.org/show:db

jargon "The Jargon File (version 4.4.7, 29 Dec 2003)"
foldoc "The Free On-line Dictionary of Computing (26 July 2010)"
easton "Easton's 1897 Bible Dictionary"
hitchcock "Hitchcock's Bible Names Dictionary (late 1800's)"
bouvier "Bouvier's Law Dictionary, Revised 6th Ed (1856)"
Now, in order to find the actual meaning of Bash in computing, we can search for bash in the “foldoc” dictionary as follows:
$ curl dict://dict.org/d:bash:foldoc
The result will be,
bash

   Bourne Again SHell.  {GNU}'s {command interpreter} for {Unix}.
   Bash is a {Posix}-compatible {shell} with full {Bourne shell}
   syntax, and some {C shell} commands built in.  The Bourne
   Again Shell supports {Emacs}-style command-line editing, job
   control, functions, and on-line help.  Written by Brian Fox of
   {UCSB}.
For more details on DICT, please read RFC 2229.

14. Use Proxy to Download a File

We can tell cURL to use a proxy for an operation using the -x option. We need to specify the host and port of the proxy.
$ curl -x proxysever.test.com:3128 http://google.co.in

15. Send Mail using SMTP Protocol

cURL can also be used to send mail using the SMTP protocol. You should specify the from address, the to address and the mail server address as shown below.
$ curl --mail-from blah@test.com --mail-rcpt foo@test.com smtp://mailserver.com
Once the above command is entered, it will wait for the user to provide the data to mail. Once you’ve composed your message, type . (period) as the last line, which will send the email immediately.
source: thegeekstuff.com