mail routing performance issues, backing up in /var/spool/mqueue
Denis Beauchemin
Denis.Beauchemin at USHERBROOKE.CA
Tue Jun 29 21:34:12 IST 2004
Lester,
Your top and vmstat output show no problem at all. It is quite common
for systems to use all otherwise-unallocated memory for internal
buffers and cache. When you look at memory use you may be tempted to
believe that there is none available, but it is just being used for
internal housekeeping and will be freed on demand as soon as user
processes request it.
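For example, the "-/+ buffers/cache" line from free is the number to
watch, not the raw "free" column. Roughly, using the figures from your
own top output (the exact numbers below are illustrative):

  $ free -m
               total       used       free     shared    buffers     cached
  Mem:          1261       1207         54          0         50         42
  -/+ buffers/cache:       1115        146

About 92 MB of that "used" memory is really just buffer/cache that the
kernel hands back the moment a process needs it.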
As for your outgoing queue backing up, are you sure these are not
messages that couldn't be delivered? My MS servers all have a lot of
messages in mqueue (between 2000 and 3000 each); this is normal.
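A quick way to check is mailq (i.e. sendmail -bp): it lists every
queued message together with the reason it is still sitting there,
such as "Deferred: Connection timed out". To just count the queue
(a sketch, assuming the standard sendmail qf/df spool layout):

  mailq | less
  ls /var/spool/mqueue/qf* 2>/dev/null | wc -l

If most entries show a Deferred status, the backlog is caused by
unreachable remote sites, not by your server.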
Denis
lester lasad wrote:
>--- Ugo Bellavance <ugob at CAMO-ROUTE.COM> wrote:
>
>>lester lasad wrote:
>>
>>>fedora core 1
>>>mailscanner-4.31.6-1
>>>SpamAssassin version 2.63
>>>mta = sendmail
>>>
>>>mail is backing up in /var/spool/mqueue. It is getting
>>>to /var/spool/mqueue.in in a timely manner, but once it
>>>gets to mqueue it is taking too long to route.
>>
>>This looks like a sendmail problem, since MailScanner's
>>job is finished once messages are in /var/spool/mqueue.
>>How is your outgoing sendmail called?
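>>(For reference, the usual split-spool arrangement is two sendmail
>>instances, one accepting mail into the incoming queue and one
>>draining the outgoing queue. A sketch, assuming the stock paths;
>>your init script may differ:
>>
>>  # incoming: accept connections, queue only, into mqueue.in
>>  sendmail -bd -ODeliveryMode=queueonly \
>>      -OQueueDirectory=/var/spool/mqueue.in
>>  # outgoing: run /var/spool/mqueue every 15 minutes
>>  sendmail -q15m
>>
>>If the queue runner was never started, or its -q interval is very
>>long, messages will pile up in mqueue exactly like this.)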
>
>My MS conf settings have it set to queue. If I'm
>answering this incorrectly please let me know.
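>(Specifically, I believe the relevant line in
>/etc/MailScanner/MailScanner.conf is:
>
>  Delivery Method = queue
>
>so MailScanner just drops messages into mqueue and leaves delivery
>to the sendmail queue runner.)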
>
>>>I have been using this same setup for roughly two
>>>weeks and just started noticing the performance
>>>issues Friday afternoon.
>>>
>>Do you see anything else, such as swapping or high CPU usage?
>>
>>What is the output of free?
>>What is the output of:
>>
>>  vmstat 1
>>
>
>This is output from top; hardly any memory appears to be
>available.
> 12:18:55  up 23:19,  2 users,  load average: 1.61, 2.03, 1.91
>58 processes: 57 sleeping, 1 running, 0 zombie, 0 stopped
>CPU states:  cpu    user    nice  system    irq  softirq  iowait    idle
>            total  13.4%    0.0%    4.2%   0.0%     0.0%    0.0%  182.0%
>            cpu00   1.3%    0.0%    0.7%   0.0%     0.0%    0.0%   97.8%
>            cpu01  12.1%    0.0%    3.5%   0.0%     0.0%    0.0%   84.2%
>Mem: 1291356k av, 1236132k used,  55224k free,  0k shrd,  51216k buff
>
>Sorry for all the output, but this is the result from
>vmstat 1:
>
>vmstat 1
>procs                    memory     swap         io    system         cpu
> r  b   swpd   free   buff  cache  si  so   bi    bo   in   cs  us  sy  wa  id
> 0  0  29716  52092  51224  43008   0   0    2   211  109   84  18   5   0   77
> 0  0  29716  36940  51224  43012   0   0    0     0  171  104  18   8   0   73
> 0  0  29716  42432  51224  43016   0   0    0     0  170  123  11   4   0   85
> 1  0  29716  31508  51224  43032   0   0    0   884  284  141  37  11   0   52
> 1  0  29716  53780  51224  43028   0   0    0   516  158  170  44  11   0   45
> 0  0  29716  53796  51224  43004   0   0    0   304  129   49   1   0   0   99
> 0  0  29716  53596  51224  43004   0   0    0    36  164   70   0   1   0   98
> 0  0  29716  52792  51224  43012   0   0    0   208  181  109   0   0   0   99
> 0  0  29716  52752  51224  43028   0   0    0   384  170  130   0   1   0   99
> 0  0  29716  43948  51224  43040   0   0    0    76  172   95  10   3   0   87
> 0  0  29716  52212  51224  43040   0   0    0   128  151   97   7   2   0   91
> 0  0  29716  43976  51224  43044   0   0    0     0  168   91   6   3   0   91
> 0  0  29716  43524  51224  43088   0   0    0     0  264  239  16   5   0   78
> 1  2  29716  43580  51224  45748   0   0    0  3256  266  227  25  12   0   63
> 0  0  29716  43648  51224  43148   0   0    0   452  204  210  26   5   0   69
> 0  0  29716  41032  51224  43196   0   0    0   892  253  186   3   3   0   94
> 0  3  29716  43004  51224  43244   0   0    0  4536  285  333  43  13   0   44
> 3  2  29716  33708  51224  43256   0   0    0  5356  337  371  41  12   0   47
> 0  0  29716  41336  51232  43160   0   0    0  4328  230  218  39  12   0   49
> 0  0  29716  41152  51232  43160   0   0    0     0  137   40   0   0   0   99
> 0  0  29716  41732  51232  43172   0   0    0   376  156   90   0   0   0   99
> 0  0  29716  41588  51232  43172   0   0    0    24  139   33   0   0   0  100
> 1  0  29716  41932  51232  43192   0   0    0   472  179  161   0   2   0   98
> 1  0  29716  36200  51232  43212   0   0    0    80  173  131  12   5   0   82
> 0  0  29716  48408  51232  43220   0   0    0  3700  259  266  36   6   0   57
> 2  0  29716  30084  51232  43244   0   0    0    20  194  146  29   9   0   62
> 4  0  29716  30568  51232  45840   0   0    0   296  170  177  40  11   0   49
> 1  0  29716  45340  51232  45824   0   0    0   664  157  172  38  10   0   52
> 0  0  29716  49760  51232  43212   0   0    0   320  132   66  22   2   0   76
> 0  0  29716  50172  51232  43212   0   0    0     0  128   29   0   0   0   99
> 0  0  29716  50172  51232  43212   0   0    0   148  147   15   0   0   0  100
> 0  0  29716  49960  51232  43212   0   0    0     0  122   13   0   0   0  100
> 0  0  29716  49784  51232  43212   0   0    0     0  135   41   0   1   0   99
> 0  0  29716  49784  51232  43212   0   0    0     0  114    9   0   0   0  100
> 0  0  29716  49632  51232  43228   0   0    0   300  152  108   0   1   0   99
> 1  0  29716  44536  51232  43228   0   0    0   140  205  111   1   2   0   96
> 1  0  29716  37864  51232  43244   0   0    0   228  204  172   9   3   0   87
> 0  0  29660  38948  51232  43264   0   0    0  4656  288  293  39  17   0   45
> 1  0  29660  38528  51232  45000   0   0    0   376  238  221  24   7   0   69
> 2  0  29660  48644  51232  43324   0   0    0   444  226  261  30   6   0   63
> 0  0  29660  46856  51232  43364   0   0    0  4492  249  228  31   8   0   61
> 0  0  29660  47380  51232  43384   0   0    0   220  164  108   0   0   0  100
> 0  0  29660  47956  51232  43384   0   0    0     0  120   14   0   0   0  100
> 2  0  29660  28936  51232  43444   0   0    0   316  208  213  20   8   0   72
> 1  0  29660  30768  51232  43420   0   0    0     0  182  167  19   5   0   76
> 4  0  29660  31056  51236  43516   0   0    0  3636  274  345  37  11   0   52
> 0  3  29660  36784  51244  43512   0   0    0  4796  284  339  50  14   0   36
> 1  0  29660  32300  51244  46008   0   0    0  1116  269  430  26   9   0   65
> 0  0  29660  36200  51244  43448   0   0    0  1148  289  310  23   6   0   71
> 0  0  29660  26932  51244  43456   0   0    0   104  172  108  10   3   0   86
> 1  2  29660  36652  51244  43464   0   0    0   592  189   85  12   4   0   84
> 1  0  29660  31236  51244  43448   0   0    0   236  143   82  37   4   0   58
> 1  0  29660  24948  51244  43452   0   0    0     0  153   79  11   1   0   88
> 1  0  29660  43564  51244  46408   0   0    0  1016  190  242  43  16   0   41
> 1  0  29660  44900  51244  43452   0   0    0   256  161   76  23   5   0   72
> 0  0  29660  36860  51244  43460   0   0    0    20  186  128   7   3   0   90
> 0  3  29660  46244  51244  43480   0   0    0   940  219  278  32  10   0   58
> 0  0  29660  37776  51244  43484   0   0    0    72  177  125   7   3   0   90
> 1  0  29660  33804  51244  43496   0   0    0  4660  197  189  30   8   0   61
> 0  0  29660  37660  51244  43492   0   0    0   196  163  125  14   4   0   81
> 2  3  29660  34836  51244  44856   0   0    0   772  264  251  20   9   0   71
> 1  0  29660  32336  51244  43504   0   0    0  7896  227  189  53  10   0   37
> 0  0  29660  36308  51244  43500   0   0    0   168  171  110  14   4   0   82
> 1  0  29660  32696  51244  43516   0   0    0  3572  198  168  31   7   0   62
> 0  1  29660  35764  51244  43524   0   0    0   268  208  191  17   5   0   78
> 1  0  29660  34016  51244  43536   0   0    0   616  176  182  24   7   0   68
> 2  3  29660  30468  51244  43564   0   0    0   760  328  362  30   6   0   64
> 1  3  29660  28920  51244  46180   0   0    0  2160  252  309  41  16   0   43
> 2  0  29660  36512  51244  46452   0   0    0  2984  220  219  38  11   0   50
> 0  1  29660  33064  51244  43592   0   0    0   488  171   81  34   6   0   60
> 0  0  29660  33924  51244  43584   0   0    0     0  171   96  13   5   0   81
> 1  2  29660  41464  51244  44836   0   0    0   352  140  103  11  10   0   78
> 0  0  29660  43756  51244  43580   0   0    0   312  135   53  25   4   0   71
> 0  0  29660  43348  51244  43580   0   0    0     0  127   31   0   0   0  100
> 0  0  29660  43348  51244  43580   0   0    0    28  116   21   0   0   0  100
> 0  0  29660  43348  51244  43580   0   0    0     0  122   10   0   0   0  100
> 1  0  29660  38712  51244  43580   0   0    0     0  140   56   2   0   0   97
> 3  0  29660  27732  51244  43600   0   0    0   400  263  285  13   8   0   79
> 2  0  29660  21532  51244  43628   0   0    0  5040  355  336  37   7   0   56
> 1  0  29660  32884  51244  45144   0   0    0  1888  462  422  34   9   0   57
> 1  0  29660  38832  51244  43656   0   0    0   380  170  136  36   8   0   56
> 0  0  29660  24936  51244  43672   0   0    0     0  174  103  25   4   0   71
> 0  0  29660  34176  51244  43660   0   0    0     0  156   84   6   3   0   91
> 3  0  29660  30560  51244  46268   0   0    0   344  205  190  22  13   0   65
> 0  5  29656  30784  51260  46304   0   0    0  3916  282  228  50  15   0   35
> 2  0  29656  38740  51260  43720   0   0    0   652  221  212  18   4   0   78
> 0  2  29656  36584  51260  43736   0   0    0   356  189  134   1   1   0   98
> 0  0  29656  35996  51260  43732   0   0    0     4  167   96   2   2   0   96
> 1  0  29656  27908  51260  43748   0   0    0   444  196  163  10   4   0   86
> 0  2  29656  23776  51260  43760   0   0    0   720  272  138   6   3   0   90
> 1  4  29656  23148  51260  43784   0   0    0   464  303  356  13   4   0   83
> 2  0  29652  39768  51260  46200   0   0    0  1436  405  310  20  13   0   67
> 2  0  29652  29552  51260  43796   0   0    0   344  178  123  32  10   0   57
>open: No such file or directory
>error: failed to parse /proc/stats
>
>>>I am not finding anything in the maillog that would
>>>lead me to believe that there is a problem. The only
>>>steps I have taken so far were to try using different
>>>DNS servers, and I also turned on skip_rbl_checks in
>>>/etc/MailScanner/spam.spamassassin.prefs.conf
>>>
>>Do you have a local caching nameserver?
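>>(Without one, every RBL and SpamAssassin DNS lookup goes out to
>>the network for each message. One fix — a sketch, assuming the
>>Red Hat/Fedora caching-nameserver package is installed — is to
>>run a local caching named and point the box at itself:
>>
>>  service named start    # start the caching nameserver
>>  chkconfig named on     # have it start at boot
>>  # then put "nameserver 127.0.0.1" first in /etc/resolv.conf
>>
>>so repeated lookups are answered from the local cache.)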
>>
>No...
>
>>>If I restart MailScanner it seems to route quickly, but
>>>only for a couple of minutes; then it starts backing up
>>>again. I usually have anywhere from 200 to 500 messages
>>>in mqueue at one time since this problem started surfacing.
>>>
>>And how many did you have before?
>>
>Mail was routed as quickly as it came in; I would say
>no more than 10-20 messages in the queue at a time.
>
>>>Any other suggestions would be greatly appreciated.
>>>My box routes roughly 40,000 - 50,000 messages per day.
>>
--
_
°v° Denis Beauchemin, analyste
/(_)\ Université de Sherbrooke, S.T.I.
^ ^ T: 819.821.8000x2252 F: 819.821.8045
-------------------------- MailScanner list ----------------------
To leave, send leave mailscanner to jiscmail at jiscmail.ac.uk
Before posting, please see the Most Asked Questions at
http://www.mailscanner.biz/maq/ and the archives at
http://www.jiscmail.ac.uk/lists/mailscanner.html