Could this be a malicious hack attempt on my NetTalk webserver?

Hi Folks,

The attached screenshot shows a couple of NetTalk WebServer errors of the kind you generally see when the browser goes looking for something it can’t find.

The location that the browser (or caller) is requesting is:

{NettalkWebFolder}\ecp\Current\exporttool\microsoft.exchange.ediscovery.exporttool.application

These errors show up a lot for me in certain environments. For example, if I am working in a dev environment where the physical images aren’t in the location the webserver expects them to be, you see this sort of error.

Initially I wrote them off.

But on further thought, I wondered if there might be something nefarious going on.

What do you reckon?

Looks like they’re looking to do some Exchange ediscovery.

Thanks Jeff! Seems like, yeah.

I guess I need to do a bit more logging investigation to see where it’s coming from.

Hi Stu,

If you run your server on port 80, or port 443, you will see lots of requests per day which are clearly malicious. This is not new, and has been going on since web servers were invented.

Each of these requests is probing for some known vulnerability in the server. In this case, there’s a known vulnerability in Exchange Server, so millions of scripts are scanning the internet for machines with this vulnerability which have not been patched yet.

You’ll see lots of these every day, targeting anything and everything - old versions of WordPress, IIS, Apache, PHP, cgi-bin, and so on. Regardless of which server or server-side language you use, you should keep it up to date so that patches get applied as they are created. If you wait too long, a vulnerability becomes “known” and scripts will start searching for it.

At this time I’m not aware of any that are targeting NetTalk servers - and I’m not aware of any that would breach a NetTalk server even if they were created. But, like everything, I’m more sure of, say, NetTalk 12 than I am of, say, NetTalk 4… Staying reasonably up to date is important.

Cheers
Bruce

2 Likes

Thanks Bruce, this is a fantastic breakdown.

I see random URL attempts all the time. So much so that for the frequent ones, I disconnect them in the WebHandler procedure, in both the ValidateFileName and ParseRequestHeader embeds. It might only be needed in one of them, but whichever runs second is probably not being called, given the abandon-ship (i.e. ReturnValue = Net:NotOK).

At the moment, this duct-tape-and-baling-wire solution looks like this:

! We don’t use PHP, CGI or cgi-bin, so this tosses out the low-hanging fruit of slimeballs that try to hack us.
! (csLowerClippedFileName and csFailReason are string locals declared elsewhere, not shown here.)
 csLowerClippedFileName = LOWER(CLIP(SELF.WholeURL))
 IF    INSTRING('.php'        , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'php'
 ELSIF INSTRING('.cgi'        , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'cgi'
 ELSIF INSTRING('cgi-bin'     , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'cgi-bin'
 ELSIF INSTRING('die(@md5'    , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'die(@md5'
 ELSIF INSTRING('=die'        , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '=die'
 ELSIF INSTRING('webdav'      , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'webdav'
 ELSIF INSTRING('.asp'        , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '.asp'
 ELSIF INSTRING('.git'        , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '.git'
 ELSIF INSTRING('phpstorm'    , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'phpstorm'
 ELSIF INSTRING('phpunit'     , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'phpunit' 
 ELSIF INSTRING(';chmod'      , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'chmod'
 ELSIF INSTRING(';wget'       , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'wget' 
 ELSIF INSTRING('/admin'      , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '/admin' 
 ELSIF INSTRING('/auth'       , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '/auth' 
 ELSIF INSTRING('/owa'        , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '/owa' 
 ELSIF INSTRING('/jsonws'     , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '/jsonws' 
 ELSIF INSTRING('/config'     , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '/config' 
 ELSIF INSTRING('admin/'      , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'admin/' 
 ELSIF INSTRING('wp-content'  , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'wp-content' 
 ELSIF INSTRING('wp-includes' , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'wp-includes' 
 ELSIF INSTRING('.env'        , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = '.env' 
 ELSIF INSTRING('well-known'  , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'well-known' 
 ELSIF INSTRING('robots.txt'  , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'robots.txt' 
 ELSIF INSTRING('mstshash'    , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'mstshash' 
 ELSIF INSTRING('androxgh0st' , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'androxgh0st'  
 ELSIF INSTRING('currentsetting.htm'  , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'currentsetting.htm' 
 ELSIF INSTRING('server: akamaighost'  , csLowerClippedFileName,1,1) > 0  THEN csFailReason  = 'Server: AkamaiGHost' 
                                                             ELSE csFailReason  = ''
 END 

 IF csFailReason > ' '                                                          
    p_web.AddLog('ParseRequestHeader: Punted ' & csFailReason & ' hack attempt via ' & CLIP(SELF.WholeURL))
    p_web.Trace ('ParseRequestHeader: Punted ' & csFailReason & ' hack attempt via ' & CLIP(SELF.WholeURL))
    ReturnValue = Net:NotOK
 END
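
As a side note: if the ELSIF chain keeps growing, the same check could be table-driven. This is a rough sketch only, assuming a hypothetical BadTokens queue and a LONG counter declared in the data section and populated at startup - none of that is in the code above:

 ! In the data section (hypothetical names):
 !   BadTokens   QUEUE
 !   Token         STRING(40)
 !               END
 !   i           LONG
 ! Fill it once (BadTokens.Token = '.php' ; ADD(BadTokens) ; etc.), then per request:
 LOOP i = 1 TO RECORDS(BadTokens)
   GET(BadTokens, i)
   IF INSTRING(CLIP(BadTokens.Token), csLowerClippedFileName, 1, 1) > 0
      csFailReason = CLIP(BadTokens.Token)
      BREAK
   END
 END

Either way the fail path stays the same: log it, then set ReturnValue = Net:NotOK.
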
4 Likes

Cheers Mark, nice one!

1 Like

I would remove the reference to well-known (because that folder is used by the Let’s Encrypt support when validating control of the domain).

I might also remove the check for robots.txt - that’s a valid file which you can add to your web folder with instructions for web crawlers (like Google). In some cases you don’t want the site searched at all, and in that case a simple, minimal robots.txt is likely the best option:

User-agent: *
Disallow: /

For more on robots.txt, see:

2 Likes

Thanks for the wisdom Bruce!

This code works fine even when using LE in NTWS (perhaps by accident). I have logged a number of attempts by “people” trying to use that path in the URL, so they are miscreants I want to toss.

I should have noted that this code is used on my API servers (the only NTWS servers I have), so I do not want robots.txt checked, even by legit bots. FWIW, Google is known to conditionally ignore robots instructions (google it), and in certain situations (which I don’t remember right this moment) it will still index the URL despite your instructions not to do so.

Kind of a sidebar… if you care about robots/indexing etc., you should be using the robots meta tags as well.
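
(For what it’s worth, that’s just a tag in the page’s <head> - e.g. <meta name="robots" content="noindex, nofollow"> - which asks crawlers not to index the page or follow its links. Like robots.txt, it’s advisory only, not enforced.)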

hah - I checked, and since all “.well-known/acme-challenge/” requests are routed via the BabyWebServer, they won’t get to your code, and so LE will still work. And those that aren’t acme-challenge will be discarded by your code.

So I retract my recommendation to remove the reference to well-known - it is fine (and useful) where it is.

1 Like

Although .well-known could also be beneficial for the security.txt file. If you’re the one running the show, then maybe it’s not as big a deal to you, but if you were to deploy it on-premises, for instance, your customers might wish to have that location accessible. See https://securitytxt.org/
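
For reference, a minimal security.txt (served from /.well-known/security.txt) only needs a couple of fields per RFC 9116; the contact address and expiry date below are placeholders, not real values:

Contact: mailto:security@example.com
Expires: 2026-12-31T23:59:59Z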

Appreciate the detail. After you mentioned that, I wondered why it was working :)

Maverick, our servers will never be on-prem, but it’s a good consideration to make note of.

Mark

1 Like