Find lines in big files
Get all lines in a big file containing some text:
grep -E "141.71.45.20|159.205.92.236|171.33.185.210" filename
Thanks to Günter!
dig domainname.tld
To see all DNS records for the domain at once, add the ANY query type:
dig domainname.tld ANY
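The same idea works for a single record type or against a specific resolver (the MX type and the 8.8.8.8 resolver below are just examples):
dig domainname.tld MX
dig @8.8.8.8 domainname.tld A +short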
Fast and easy if the line number is known:
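Presumably this means printing just that one line; a minimal sketch (assuming GNU or BSD sed and that the wanted line is, say, number 1234):
sed -n '1234p' filename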
List all files below the current directory that are not images or PDFs:
find . -type f -not -iname "*.jpg" -not -iname "*.jpeg" -not -iname "*.gif" -not -iname "*.png" -not -iname "*.pdf"
Useful!
Split a file into multiple files every 3 lines, i.e. the first 3 lines go into F1, the next 3 lines into F2, and so on:
$ awk 'NR%3==1{x="F"++i;}{print > x}' filename
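A quick way to check it (the 9-line sample file is only an illustration):
$ seq 9 > filename
$ awk 'NR%3==1{x="F"++i;}{print > x}' filename
$ wc -l F1 F2 F3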
Not a random-node filter, but a way to display one of several images attached to a node in any list or display. Useful for several of my clients, but not standard.
First of all, create a simple user with SSH login:
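A minimal sketch (assuming a Debian-style system, run as root or via sudo; the user name testuser is only a placeholder):
adduser testuser
adduser creates the home directory and asks for a password; on most default installs that user can then log in over SSH, or you can add a public key to /home/testuser/.ssh/authorized_keys for key-based login instead.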
cd /var/files/test
find . -type d -exec chmod 0777 {} \;
function getNumPagesInPDF($PDFPath = NULL)
{
    // Rough page count: read the raw PDF and count its "/Page" object markers (may miscount for compressed PDFs).
    $stream = fopen($PDFPath, "r");
    $PDFContent = fread($stream, filesize($PDFPath));
    fclose($stream);
    return preg_match_all("/\/Page\W/", $PDFContent, $matches);
}
Use 'ALT+SHIFT+A' et voilà ...