Download Google Drive Files with wget or curl

I now have a Python script with many more features (e.g. folder support).
You can read more about it here:

Often I find myself needing to download Google Drive files on a remote headless machine without a browser.

Below are the simple shell commands to do this using wget or curl.

Small file = less than 100MB
Large file = more than 100MB (more steps due to Google's 'unable to virus scan' warning)

The fileid and filename values below are what need changing for the particular file you want to download.

The fileid can be found in the Google URL of the file you want to download.
Set the filename to anything you like (most likely the original file's name).

Note: Make sure the file has been shared 'via link' as the script does not authenticate you.
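For example, a sharing link has the form https://drive.google.com/file/d/<fileid>/view, so the fileid can be cut out with sed. The share_url below is just an illustration reusing the fileid from the small-file example:

```shell
# Pull the fileid out of a Google Drive sharing link with sed.
# The share_url value is an illustrative example, not a real shared file.
share_url='https://drive.google.com/file/d/1yXsJq7TTMgUVXbOnCalyupESFN-tm2nc/view?usp=sharing'
fileid=$(echo "$share_url" | sed -rn 's#.*/file/d/([^/]+).*#\1#p')
echo "$fileid"
```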

Small File (less than 100MB)

cd ~

export fileid=1yXsJq7TTMgUVXbOnCalyupESFN-tm2nc
export filename=matthuisman.jpg

## WGET ##
wget -O $filename 'https://docs.google.com/uc?export=download&id='$fileid

## CURL ##
curl -L -o $filename 'https://docs.google.com/uc?export=download&id='$fileid
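Both tools fetch the same direct-download endpoint, so the URL construction can be factored out. A minimal sketch, assuming Google's uc?export=download endpoint (the gdrive_url helper name is my own, not part of the post):

```shell
# Build the direct-download URL for a given fileid.
# gdrive_url is an illustrative helper name, not from the original commands.
gdrive_url() {
    echo "https://docs.google.com/uc?export=download&id=$1"
}

# Either tool can then consume it, e.g.:
#   wget -O "$filename" "$(gdrive_url "$fileid")"
gdrive_url 1yXsJq7TTMgUVXbOnCalyupESFN-tm2nc
```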

Large File (more than 100MB)

cd ~

export fileid=1sNhrr2u6n48vb5xuOe8P9pTayojQoOc_
export filename=combian.rar

## WGET ##
wget --save-cookies cookies.txt 'https://docs.google.com/uc?export=download&id='$fileid -O- \
     | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1/p' > confirm.txt

wget --load-cookies cookies.txt -O $filename \
     'https://docs.google.com/uc?export=download&id='$fileid'&confirm='$(<confirm.txt)

## CURL ##
curl -L -c cookies.txt 'https://docs.google.com/uc?export=download&id='$fileid \
     | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1/p' > confirm.txt

curl -L -b cookies.txt -o $filename \
     'https://docs.google.com/uc?export=download&id='$fileid'&confirm='$(<confirm.txt)

rm -f confirm.txt cookies.txt
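The sed step in both large-file recipes pulls Google's confirm token out of the warning page so it can be appended to the second request. A minimal offline sketch of just that parsing, using a made-up stand-in for the warning-page HTML:

```shell
# Parse the confirm token the way the large-file recipes do.
# 'sample' is a fabricated stand-in for Google's virus-scan warning page.
sample='<a href="/uc?export=download&confirm=t8Nk&id=XYZ">Download anyway</a>'
confirm=$(echo "$sample" | sed -rn 's/.*confirm=([0-9A-Za-z_]+).*/\1/p')
echo "$confirm"
```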