Working Azure logging setup


We're using Log4Net on Azure as well.

A minor warning: when running multiple roles per instance, I've only been able to get one role (the main role) to actually write logs successfully... not good!

How to set it up... Easy as.

In Global.asax, configure log4net as per usual:

protected void Application_Start()
{
    log4net.Config.XmlConfigurator.Configure();
}

In your role entry point, run the configuration as well. Who knows why, but if you don't do it in both places, it won't work (well, not in my case anyway):

public override void Run()
{
    log4net.Config.XmlConfigurator.Configure();
}

Then, in your config file:

<log4net>
    <appender name="TraceAppender" type="log4net.Appender.TraceAppender">
        <layout type="log4net.Layout.PatternLayout">
            <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] - %message%newline" />
        </layout>
    </appender>
    <root>
        <level value="ALL" />
        <appender-ref ref="TraceAppender" />
    </root>
</log4net>
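
With both of those in place, logging is the standard log4net pattern; everything written to an ILog flows through the TraceAppender and on to the Azure trace listener configured below. A minimal sketch (the class and messages are made up for illustration):

using System;
using log4net;

public class OrderProcessor
{
    // One static logger per class is the usual log4net idiom.
    private static readonly ILog Log =
        LogManager.GetLogger(typeof(OrderProcessor));

    public void Process(int orderId)
    {
        // Goes to the TraceAppender, then to the Azure trace listener.
        Log.InfoFormat("Processing order {0}", orderId);
        try
        {
            // ... real work here ...
        }
        catch (Exception ex)
        {
            Log.Error("Order processing failed", ex);
            throw;
        }
    }
}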

Make sure you have Azure tracing enabled:

<system.diagnostics>
    <trace>
        <listeners>
            <add type="Microsoft.WindowsAzure.Diagnostics.DiagnosticMonitorTraceListener, Microsoft.WindowsAzure.Diagnostics, Version=1.7.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35" name="AzureDiagnostics">
                <filter type="" />
            </add>
        </listeners>
    </trace>
</system.diagnostics>
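
One thing the snippet above doesn't show: the DiagnosticMonitorTraceListener only buffers trace output locally on the instance, and entries only land in the WADLogsTable once a scheduled transfer is running. A minimal sketch using the SDK 1.x diagnostics API (matching the 1.7 assembly referenced above), typically placed in OnStart:

using System;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Ship buffered trace output to the WADLogsTable every minute.
        DiagnosticMonitorConfiguration config =
            DiagnosticMonitor.GetDefaultInitialConfiguration();
        config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);
        config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;

        // The setting name comes from the Diagnostics plugin import.
        DiagnosticMonitor.Start(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
            config);

        return base.OnStart();
    }
}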

We opted not to use Storage Studio, the reason being that you pay for any data coming out of Azure, while data transactions within Azure are free. Hooking into Table Storage is easy as pie, so we built a screen to show the logs; it took two or three hours and works like a bomb. A sketch of the reading side follows.
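
The answer doesn't include that screen's code; here is a minimal sketch of the reading side, assuming the logs land in the standard WADLogsTable and using the old StorageClient library. The entity only declares the columns being displayed:

using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Shape of a WADLogsTable row; only the columns we want to show.
public class WadLogEntry : TableServiceEntity
{
    public long EventTickCount { get; set; }
    public string RoleInstance { get; set; }
    public int Level { get; set; }
    public string Message { get; set; }
}

public class LogScreen
{
    // Returns the last hour of log entries for display.
    public IQueryable<WadLogEntry> LastHour(CloudStorageAccount account)
    {
        TableServiceContext ctx =
            account.CreateCloudTableClient().GetDataServiceContext();

        // WAD partition keys are "0" plus the event time's tick count,
        // zero-padded to 19 digits, so a string comparison is a time filter.
        string from = "0" + DateTime.UtcNow.AddHours(-1).Ticks.ToString("D19");

        return ctx.CreateQuery<WadLogEntry>("WADLogsTable")
                  .Where(e => e.PartitionKey.CompareTo(from) > 0);
    }
}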

We're using log4net quite successfully with Azure; however, we are dumping the data into Azure Table Storage. It is definitely a more scalable solution. I highly recommend purchasing a license for Cerebrata's (now Redgate's) Storage Studio to ease the pain of dealing with Azure Table Storage, and their Diagnostics Manager to ease the pain of looking at trace logs.

For scenarios with large amounts of log data, we have tried directing the data into blobs, partly using the EtwTraceListener, as well as sending the critical pieces of information that we need to act on to Table Storage. We use the AzureStorageTraceListener, and we also created a custom TraceListener that directs the data to our own Azure table, because the out-of-the-box schema did not meet our requirements. WADLogsTable can grow quickly, and we need a way to trim it without incurring a huge storage transaction cost or taking tracing offline while we finish deleting and recreating the table, so our custom listener creates a different table for each month; see the sketch below.
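
The answer doesn't share the listener itself; here is a minimal sketch of the monthly-table idea against the old StorageClient API. All names here (MonthlyTableTraceListener, LogRow, the "Logs" table prefix) are made up for illustration:

using System;
using System.Diagnostics;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Hypothetical log row; adapt the properties to your own schema.
public class LogRow : TableServiceEntity
{
    public LogRow() { } // required by the table client

    public LogRow(string message)
    {
        PartitionKey = DateTime.UtcNow.ToString("yyyyMMddHH");
        RowKey = Guid.NewGuid().ToString();
        Message = message;
    }

    public string Message { get; set; }
}

public class MonthlyTableTraceListener : TraceListener
{
    private readonly CloudTableClient _client;

    public MonthlyTableTraceListener(CloudStorageAccount account)
    {
        _client = account.CreateCloudTableClient();
    }

    // One table per month, e.g. Logs201207. Dropping an old month is a
    // single DeleteTable call instead of millions of entity deletes.
    private string CurrentTableName
    {
        get { return "Logs" + DateTime.UtcNow.ToString("yyyyMM"); }
    }

    public override void WriteLine(string message)
    {
        // Simplified: a real listener would cache the table-exists check
        // and batch writes rather than doing a round trip per entry.
        string table = CurrentTableName;
        _client.CreateTableIfNotExist(table);

        TableServiceContext ctx = _client.GetDataServiceContext();
        ctx.AddObject(table, new LogRow(message));
        ctx.SaveChanges();
    }

    public override void Write(string message)
    {
        WriteLine(message);
    }
}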

Ranjith
http://www.opstera.com

I use custom logging to write log data out to Azure Table Storage for all my sites. However, as you rightly point out, retrieving the data from the tables is the awkward bit. Therefore I also have a local, internal-use-only website running on our premises which retrieves the data from the Azure tables and stores it in a local database (sketched below). Although you do pay to transfer data out of the Azure system, the costs are minuscule. Our logging solution for all our websites costs us less than £1 per month.
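
That pull job isn't shown in the answer; here is a minimal sketch of the idea, reusing the WadLogEntry class from the earlier sketch. The local Logs table and its columns are made up for illustration:

using System;
using System.Data.SqlClient;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

public class LogPuller
{
    // Copies WAD rows newer than the last pull into a local SQL table;
    // run this on a timer from the internal site.
    public void Pull(CloudStorageAccount account,
                     string sqlConnectionString,
                     DateTime lastPulledUtc)
    {
        TableServiceContext ctx =
            account.CreateCloudTableClient().GetDataServiceContext();
        string from = "0" + lastPulledUtc.Ticks.ToString("D19");

        var rows = ctx.CreateQuery<WadLogEntry>("WADLogsTable")
                      .Where(e => e.PartitionKey.CompareTo(from) > 0)
                      .AsTableServiceQuery(); // follows continuation tokens

        using (var conn = new SqlConnection(sqlConnectionString))
        {
            conn.Open();
            foreach (WadLogEntry entry in rows)
            {
                var cmd = new SqlCommand(
                    "INSERT INTO Logs (LoggedAtUtc, Message) VALUES (@t, @m)",
                    conn);
                cmd.Parameters.AddWithValue(
                    "@t", new DateTime(entry.EventTickCount, DateTimeKind.Utc));
                cmd.Parameters.AddWithValue("@m", entry.Message);
                cmd.ExecuteNonQuery();
            }
        }
    }
}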

Having the data moved into a local database means that it is infinitely more queryable than keeping it in the Azure system. I maintain a discussion on Azure logging which covers a lot of the issues above and also provides links out to code examples.
