4:2:0 to 4:2:2 compression
Hi all,
Everybody knows that 4:2:2 chroma subsampling is much better for colour correction work.
I shoot my footage with a Sony FX1, which records 4:2:0.
So the question is: is it useful to export my edited timelines as 10-bit (or 8-bit) uncompressed 4:2:2 and then do the colour work on those files, or does this process have no effect on the final quality of the colour correction?
Thanks
And I think I chose a poor way to respond. Make no mistake, I'm not arguing against your point. You're right, capturing to a better codec will not add more information to the image. What it will do is afford you more latitude in effects and editing.
Again, using your example, VHS will still be VHS. But capturing VHS and then editing and color grading in FCP will yield far better results than running a tape-to-tape editing system with color filters in between. Am I making sense? I guess I'm just trying to say that by using a better codec, you're not degrading the image again every time you apply filters and grade the image. That is why Color (and many other options) uses 10-bit precision even when working with 8-bit images. No, there will not suddenly be more information to work with in the image, but you're not throwing out information either.
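A toy sketch of that precision point (plain Python, not how Color or FCP actually work internally): once values are rounded to a coarse integer scale, no later "undo" step can bring the lost levels back.

```python
# Simulate a grade that crushes an 8-bit ramp to 6-bit range and "restores" it.
# Rounding to integers at the intermediate step collapses 256 levels to 64;
# working in higher precision would have kept all of them.
levels = list(range(256))

crushed  = [round(v * 63 / 255) for v in levels]    # lossy low-precision step
restored = [round(v * 255 / 63) for v in crushed]   # the "undo" grade

print(len(set(levels)))    # 256 distinct input levels
print(len(set(restored)))  # only 64 survive the round trip
```

That is the sense in which a 10-bit working space "doesn't throw out information": the intermediate rounding buckets are fine enough that repeated filters don't merge neighbouring levels.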
Oh, and sorry for the threadjack. We should probably let this thread go back to the question at hand...
Sending the files to Color as is should be okay (though as Zak mentioned, Color does play nicely with ProRes). But when sending the files from Color to FCP, it is better to use ProRes or some other high quality codec so as to not further* reduce the image quality.
*JP- I guess that was what I was trying to get across. It doesn't make the image better, but it definitely doesn't make the image worse, which might happen if one stays in the 4:2:0 color sampling space.
Message was edited by: Paul Conigliaro
Similar Messages
-
"get all new data request by request" after compressing source Cube
Hi
I need to transfer data from one Infocube to another and use the Delta request by request.
I have tried this when data on Source Infocube was not compressed and it worked.
Afterwards, some requests were compressed, and after that the delta request by request is transferring all the information to the target InfoCube in only one request.
Do you know if this a normal behavior?
Thanks in advance

Hi
The objective of compression is that it deletes all the requests in your F table and moves the data to the E table. After compression you no longer have the data request by request.
This is the reason you are getting all the data in a single request.
"Get data request by request" works only if you don't compress the data in your Cube.
If you want to know about compression, check the below one
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035d300-b477-2d10-0c92-f858f7f1b575?QuickLink=index&overridelayout=true
Regards,
Venkatesh. -
PDF File Size - any way to compress further?
We are using the Crystal for .NET export method to export a report to a .pdf file, and are having an issue with the PDF file size on a report that contains images. The images are stored in a SQL 2005 database as blob or varbinary(max). The report executes a stored procedure that selects data (including images) to produce a quotation. There are input parms to decide which images to print (for example A, B, or both A & B). Both types of images can appear at the line level on the quotation. Type A images print in the main report; Type B images are in a subreport. Both the main report and the subreport execute a stored procedure to select an image type.
Our issue: when both Type A and B images are selected to print on a 293-line quote, the PDF file size is 44.23MB.
When Type A only is selected, the PDF file size is 2.64MB
When Type B only is selected, the PDF file size is 43.95MB.
There are more Type B images that would print at the line level than Type A, but is there any way to compress this down further, as it is too large to email?

You mention that you are using Crystal for .NET, but not which version: CR for .NET 2003, 2005, 2008, 2010?
Applying the latest fixes for the correct version of CR would be the first thing to do.
Next, I'd have a close look at image B as it appears to be the one adding the most "bulk". How is this image different from image A?
General tips re. images:
Save the files as BMPs.
When a .jpg image is inserted into Crystal Reports it is converted to bitmap format, and as a result of the conversion the report may lose some quality; scale and other issues may arise. E.g., Crystal Reports is not a great graphics-management tool.
When an image is saved as a .bmp, then there is no conversion required and thus the quality of the image should be preserved.
Best practice: save the image with high resolution and required size as a .bmp format then insert this image into Crystal Reports.
Resize the image to the smallest possible size and downgrade the DPI to 72. This will ensure your image is as small as possible and Crystal Reports will have to put in the least amount of work to display it.
Have a look at KB [1241630 - Exporting a Crystal report (XI) to PDF generates a big PDF file|http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes_boj/sdn_oss_boj_bi/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/scn_bosap/notes%7B6163636573733d36393736354636443646363436353344333933393338323636393736354637333631373036453646373436353733354636453735364436323635373233443330333033303331333233343331333633333330%7D.do]. This KB may apply to your version of CR, or not. In any case, you will have to use the KB as a guide and determine what the appropriate registry entry would be for your version of CR.
One more thing. Many people like to use jpg files as they are smaller than bmp files. However, as far as Crystal Reports is concerned, this is inconsequential. The report file will be the same size if a file is inserted as a jpg or a bmp. This is due to the jpg conversion to bmp Crystal Reports does internally.
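To get a rough sense of why the resize/72-DPI tip above matters, here is the back-of-the-envelope arithmetic (the 4x6-inch example dimensions are made up for illustration; uncompressed 24-bit bitmap data is simply width × height × 3 bytes):

```python
# Uncompressed 24-bit bitmap size for a given print size at two resolutions.
def bmp_megabytes(width_in, height_in, dpi, bytes_per_pixel=3):
    pixels = int(width_in * dpi) * int(height_in * dpi)
    return pixels * bytes_per_pixel / 1e6

print(bmp_megabytes(4, 6, 300))  # 6.48 MB of pixel data per image
print(bmp_megabytes(4, 6, 72))   # ~0.37 MB at 72 DPI
```

Dropping from 300 to 72 DPI cuts the pixel data by a factor of about 17, which across hundreds of line-level images is the difference between a mailable PDF and a 44MB one.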
Ludek
Follow us on Twitter http://twitter.com/SAPCRNetSup
Got Enhancement ideas? Try the [SAP Idea Place|https://ideas.sap.com/community/products_and_solutions/crystalreports] -
Hello All...
Back after a brief absence, things look a little bit different.
I'm trying to take a 16 minute mini dv video and compress it for use on the web. I'm interested in any suggestions you may have on settings for the video and audio tracks. I've tried using Sorenson 3 (15 frames, key frames set to automatic, 320 x 240) for video and IMA 4:1 (mono) for audio. The resulting video looked great but the file size came in at about 255 Mb.
Thanks!
PowerMac G5 1.8 Dual Mac OS X (10.4.3)
Message was edited by: Dan Foley

Thank you for the replies. Everyone was correct about the jack, interface, and phasing problems. I have been unplugging my MOTU audio interface and then using headphones at work. I have not changed any detailed audio output settings in Logic. When I read that the jack might be a problem, I tried switching headphones. This actually helped. I am using Beats headphones and they seem to be having issues with the Mac's jack (the phasing/panning problems). I can use these headphones with other devices but not the Mac. I have to use iPod earbuds and the phasing seems fixed. Hopefully this information is helpful to someone else.
If anyone knows how to correct this issue, please let me know; it's difficult to know what my final mixes are going to sound like, and I have had to keep bouncing everything into iTunes, syncing to my iPod, and then listening on my car radio. -
Problem in Compresses file.Not able to see the data in file
Hi All,
I am downloading file into application server.
Using:
OPEN DATASET file FOR OUTPUT FILTER 'compress'.
OR
DATA: lw_command(100) TYPE c.
CONCATENATE 'compress' p_file INTO lw_command SEPARATED BY space.
CALL 'SYSTEM' ID 'COMMAND' FIELD lw_command.
When I see the file content it contains junk characters.
If I want to see the data, do I need to uncompress the file?
Can't we see the data without uncompressing the file?
Regards,
Vinod

Hi,
Uncompressing is required to read the content of a compressed file.
OPEN DATASET DSN FOR OUTPUT FILTER 'compress'.
OPEN DATASET DSN FOR INPUT FILTER 'uncompress'.
The first OPEN statement opens the file so that data written to it is compressed.
The second OPEN statement opens the file so that the data is decompressed when it is read from the file.
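The same write-compressed/read-decompressed pairing exists in most environments. A small Python gzip sketch of the analogous idea (not ABAP, and gzip rather than Unix compress, but the behaviour, including the "junk characters", is the same):

```python
import gzip
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.txt.gz")

# Analogue of OPEN DATASET ... FOR OUTPUT FILTER 'compress':
# text is compressed as it is written.
with gzip.open(path, "wt") as f:
    f.write("hello dataset\n")

# Reading the raw bytes shows "junk characters" (the compressed stream).
with open(path, "rb") as f:
    raw = f.read()
print(raw[:2])  # b'\x1f\x8b' -- the gzip magic bytes, not readable text

# Analogue of OPEN DATASET ... FOR INPUT FILTER 'uncompress':
# data is decompressed as it is read back.
with gzip.open(path, "rt") as f:
    print(f.read())  # hello dataset
```

So no, you cannot meaningfully view the content without going through the decompression filter; the raw bytes are the compressed stream itself.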
http://www.sapnet.ru/abap_docu/ABAPOPEN_DATASET.htm#&ABAP_ADDITION_8@14@ -
No sound and am unable to compress
I have a Sony Handycam and have noticed that when I download my videos to my desktop there is no sound. I downloaded MPEG Streamclip and it pretty much took care of the problem, but it has also created new issues.
A. Once I convert the video through MPEG Streamclip, I am not able to put the movie into iMovie, whereas if I don't convert it (leave it without sound) then I can.
B. If I were to try to import clips directly from my Sony Handycam the program says I still need to plug in my camera (even though it is). It's just not reading it. I don't see a way to set it to "VCR" as I've seen demonstrated in the demo.
I am ultimately trying to compress some videos to post on YouTube and would also like to burn DVDs. But it seems that because I can't get through iMovie, I am also not able to work with the videos in iDVD.
Also, is there a way to convert .mpg's from my sony digital camera (I took some small videos w/it. This isn't my video cam)? When I pulled those videos off the camera they were mpg's and I'm not seeing a way to convert them.
Any advice would be appreciated. I've been working on this for about 3 months and have made absolutely no progress. *pulls out hair*
Thanks!
Nicollette

Sorry for all the edits, I'll just tack the new question on here.
With SC open to "export" my setting choices are as follows:
Compression - set to: DV (DV25)
Standard - set to: NTSC, 720x480, 29.97fps, with a PAL option in the drop-down box. I think PAL is for non-US DVDs?
*Aspect Ratio* - set to: 4:3 with a 16:9 option in drop-down box.
*I also have an option to "check" any of the following:*
Frame Blending
Better Downscaling (currently checked)
Deinterlace Video
Resample Audio to 48KHZ (currently checked)
Split DV stream in Segment (use this option to prepare the stream for imovie) currently not checked but I now realize it needs to be.
*"Deselect (the following) for progressive movies -*
Interlaced Scaling (currently checked)
Reinterlace Chroma (currently checked)
Rotation - set to: No
Zoom: 100% X/Y: 1 Center: 0,0
Cropping (currently unchecked)
Presets
*Reset All*
Adjustments...
I'm not sure what most of these settings do or affect. Is there something online that explains each of them? To export an MPEG-1 to burn to a DVD (copying directly from the desktop, not using a program), or for use in iMovie/iDVD, I'm not sure where the settings need to be before I export. -
How to enable SOAP compression in web services
Hi, I have created a web service by exposing an EJB using annotations. The trouble now is that the output is an array of objects which has grown large enough to cause latency issues. I have seen folks using filters to compress the HTTP response, but I am unable to find many resources on the web linking annotated-EJB web services with such filters; most of the examples I see are based on handlers rather than EJB-exposed web services. Can someone point me to some resources for that type of web service + compression?
Another thing which caught my attention is the use of the <compress-html-template> tag which needs to be in the web.xml file, but since I am using a "generated" web services as per say, I do not have that web.xml file, how then can I use that feature?
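The filter idea in general looks the same regardless of stack: wrap the application and gzip the response body when the client advertises support. A generic sketch of the concept in Python/WSGI terms (hypothetical app, not the EJB container's actual filter API):

```python
import gzip

# A stand-in application returning a large, repetitive payload.
def app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"a large SOAP payload" * 100]

def gzip_filter(app):
    """Wrap an app; compress the body when the client accepts gzip."""
    def wrapped(environ, start_response):
        captured = {}
        def capture(status, headers):
            captured["status"], captured["headers"] = status, headers
        body = b"".join(app(environ, capture))
        if "gzip" in environ.get("HTTP_ACCEPT_ENCODING", ""):
            body = gzip.compress(body)
            captured["headers"].append(("Content-Encoding", "gzip"))
        start_response(captured["status"], captured["headers"])
        return [body]
    return wrapped
```

The point is that the service code is untouched; compression is negotiated per request via the Accept-Encoding header, which is why a servlet-filter style answer applies even to generated web services.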
Thanks in advance.

Hi Anke,
all tables that have been created in V9.7 with attribute COMPRESS YES will be compressed statically.
db2 " select count(*) , rowcompmode from syscat.tables group by rowcompmode "
After the upgrade to 10.5 all tables created with attribute COMPRESS YES will get rowcompmode='A' but old tables created with V9.7 will stay with rowcompmode='S' .
You can change tables from rowcompmode='S' to rowcompmode='A' via ALTER TABLE. After this, all new pages, and old pages that are touched, will be adaptively compressed. But old pages that are not touched will remain only statically compressed. To get all pages of an existing table adaptively compressed you need to move the data, for example with DB6CONV.
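The switch Frank describes would look roughly like this (table name is a placeholder, and ADAPTIVE is the DB2 10.5 syntax; verify both against your DB2 level before running anything):

```sql
-- Mark the table for adaptive (rowcompmode='A') row compression.
ALTER TABLE myschema.mytab COMPRESS YES ADAPTIVE;

-- Existing, untouched pages stay statically compressed until the data
-- is moved, e.g. via a reorg (DB6CONV is the SAP-side alternative).
CALL SYSPROC.ADMIN_CMD('REORG TABLE myschema.mytab RESETDICTIONARY');
```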
Regards
Frank -
Step by Step instructions needed to compress a 39KB file into a 20MB PDF
Newbie needs help.
I have a 39KB AI file I need to compress into a 20MB PDF while retaining all image quality for print. Thank you so much!

A 20MB file is pretty big. Are you sure that you don't mean a 20KB file? That's where everyone is confused about what you are asking. Saving your file from AI as a PDF and choosing Smallest File Size in the second dialog will probably give you the smallest file you can get.
-
Coalesce or compress this index? What is the best solution in this case?
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit

I have executed the following query on a specific index that I suspected to be in poor shape, and got the following result:
select
  keys_per_leaf, count(*) blocks
from (
  select sys_op_lbid (154813, 'L', jus.rowid) block_id,
         count (*) keys_per_leaf
  from xxx_table jus
  where jus.id is not null
     or jus.dat is not null
  group by sys_op_lbid (154813, 'L', jus.rowid)
)
group by keys_per_leaf
order by keys_per_leaf;
keys_per_leaf blocks
1 80
2 1108
3 2816
4 3444
5 3512
6 2891
7 2579
8 2154
9 1943
10 1287
11 1222
12 1011
13 822
14 711
15 544
16 508
17 414
18 455
19 425
20 417
21 338
22 337
23 327
24 288
25 267
26 295
27 281
28 266
29 249
30 255
31 237
32 259
33 257
34 232
35 211
36 209
37 204
38 216
39 189
40 194
41 187
42 200
43 183
44 167
45 186
46 179
47 179
48 179
49 171
50 164
51 174
52 157
53 181
54 192
55 178
56 162
57 155
58 160
59 153
60 151
61 133
62 177
63 156
64 167
65 162
66 171
67 154
68 162
69 163
70 153
71 189
72 166
73 164
74 142
75 177
76 148
77 161
78 164
79 133
80 158
81 176
82 189
83 347
84 369
85 239
86 239
87 224
88 227
89 214
90 190
91 230
92 229
93 377
94 276
95 196
96 218
97 217
98 227
99 230
100 251
101 266
102 298
103 276
104 288
105 638
106 1134
107 1152
229 1
230 1

This is a 5-column unique key index on (id number, dat date, id2 number, dat2 date, type number).
Furthermore, a space analysis of this index using dbms_space.space_usage gives the following picture
Number of blocks with at least 0 to 25% free space = 0 -------> total bytes = 0
Number of blocks with at least 25-50% free space = 75 -------> total bytes = 0,5859375
Number of blocks with at least 50 to 75% free space = 0 -------> total bytes = 0
Number of blocks with at least 75 to 100% free space = 0 -------> total bytes = 0
Number of full blocks with no free space = 99848 -------> total bytes = 780,0625
Total blocks ______________________________
99923
Total size MB______________________________
799,384

It seems to me that this index needs to be either coalesced or compressed.
Then, what would be the best option in your opinion?
Thanks in advance
Mohamed Houri
Edited by: Mohamed Houri on 12-janv.-2011 1:18

So let me continue with my case.
I first compressed the index as follows:
alter index my_index rebuild compress 2;

which immediately presents two new situations:
(a) index space
Number of blocks with at least 0 to 25% free space = 0 -------> total bytes = 0
Number of blocks with at least 25-50% free space = 40 -------> total bytes = 0,3125
Number of blocks with at least 50 to 75% free space = 0 -------> total bytes = 0
Number of blocks with at least 75 to 100% free space = 0 -------> total bytes = 0
Number of full blocks with no free space = 32361 -------> total bytes = 252,8203125
Total blocks ______________________________
32401
Total size Mb______________________________
259,208

meaning that the compress command freed up 67487 leaf blocks and reduced the size of the index from 799,384 MB to 259,208 MB.
It also shows a relatively nice picture of the number of keys per leaf block (compared to the previous situation).
(b) on the number of key per leaf block
KEYS_PER_LEAF BLOCKS
4 1
6 1
13 1
15 1
25 1
62 1
63 1
88 1
97 1
122 1
123 3
124 6
125 4
126 2
289 4489
290 3887
291 3129
292 2273
293 1528
294 913
295 442
296 152
297 50
298 7
299 1

In a second step, I coalesced the index as follows:
alter index my_index coalesce;

which produces the new figure:
Number of blocks with at least 0 to 25% free space = 0 -------> total bytes = 0
Number of blocks with at least 25-50% free space = 298 -------> total bytes = 2,328125
Number of blocks with at least 50 to 75% free space = 0 -------> total bytes = 0
Number of blocks with at least 75 to 100% free space = 0 -------> total bytes = 0
Number of full blocks with no free space = 32375 -------> total bytes = 252,9296875
Total blocks ______________________________
32673
Total size MB______________________________
261,384

meaning that the coalesce command has made:
(a) 298-40 = 258 new blocks with 25-50% free space
(b) 32375-32361 = 14 additional blocks which have been made full
(c) an increase of 2,176 MB in the size of the index (261,384-259,208)
while the number of keys per leaf block stays in much the same situation:
KEYS_PER_LEAF BLOCKS
4 2
5 3
9 1
10 2
12 1
13 1
19 1
31 1
37 1
61 1
63 1
73 1
85 1
88 1
122 1
123 4
124 4
125 3
126 1
289 4492
290 3887
291 3125
292 2273
293 1525
294 913
295 441
296 152
297 50
298 7
299 1

Could you please throw some light on the difference between compress and coalesce, in terms of the effect they have had on
(a) the number of keys per leaf blocks within my index
(b) the space and size of my index?
Best regards
Mohamed Houri -
Effect of Cube Compression on BIA index's
What effect does cube compression have on a BIA index?
Also does SAP recommend rebuilding indexes on some periodic basis and also can we automate index deletes and rebuild processes for a specific cube using the standard process chain variants or programs?
Thank you

Compression: DB statistics and DB indexes for the InfoCubes are less relevant once you use the BI Accelerator.
In the standard case, you could even completely forgo these processes. But please note the following aspects:
Compression is still necessary for inventory InfoCubes, for InfoCubes with a significant number of cancellation requests (i.e. high compression rate), and for InfoCubes with a high number of partitions in the F-table. Note that compression requires DB statistics and DB indexes (P-index).
DB statistics and DB indexes are not used for reporting on BIA-enabled InfoCubes. However for roll-up and change run, we recommend the P-index (package) on the F-fact table.
Furthermore: up-to-date DB statistics and (some) DB indexes are necessary in the following cases:
a)data mart (for mass data extraction, BIA is not used)
b)real-time InfoProvider (with most-recent queries)
Note also that you need compressed and indexed InfoCubes with up-to-date statistics whenever you switch off the BI accelerator index.
Hope it Helps
Chetan
@CP.. -
Problems with compressing files with right-hand click: it does not work.
Problems with compressing files with right hand click.
I am using the compress function in the Mac OS (File > Compress XX) from time to time. Today it does not work anymore. OS 10.5.6
I get a message: The content list cannot be created for compressing.
I tried it with files and folders and keep getting this message. Anybody any idea as to how to fix this?

Thanks, I love my MacBook!!!!
I also had further problems, such as copy-paste not working etc.,
so I fixed it just this morning by running AppleJack 1.5, and was back up and running when I noticed your post.
thanks for helping though! -
Anyone else having trouble opening CMYK PDFs from Photoshop CC with JPEG 'high' compression?
(Cross-post from the Photoshop forum, where I was asked to post in the Acrobat forum: http://forums.adobe.com/message/5553829 )
When I save a PDF file in Photoshop CC in CMYK color with JPEG compression (quality = high), it will save without any errors. However, when I open the file in Acrobat (11.0.3), I get an "Out of Memory" pop up error. I press Ok, and the document is just displayed as a blank white image.
I'm using Win7 64-bit, Photoshop CC 64-bit. Plenty of memory (24GB) and allocated RAM in preferences, and I've done some troubleshooting that leads me to believe the error is not due to system memory, but rather a specific bug in the software. I have Photoshop CS6 on the same machine, and if I open the PSD, save as PDF with the above settings, everything is fine when I open it using Acrobat. This seems to just be for Photoshop CC, not CS6.
Here is the troubleshooting I've done so far:
PDF format, RGB color, any other settings --> Acrobat can open the file.
PDF format, CMYK, no compression --> Acrobat can open the file.
PDF format, CMYK, JPEG compression [medium] --> Acrobat can open the file.
PDF format, CMYK, JPEG compression [high] --> Acrobat can NOT open the file because it encounters the "out of memory" error.
PDF format, CMYK, JPEG compression [maximum] --> Acrobat can open the file.
This is silly. I don't get it. Can anyone else confirm this problem for me?
Updated:
Like you said, there isn't enough information to rule out an issue with Photoshop CC, as the same action using Photoshop CS6 results in a usable file.
New information (while making files for uploading):
The error only occurs with CMYK files of larger dimensions. On a 500x500px image, Acrobat throws "A drawing error has occurred"; a 2500x2500px image runs into the "Out of memory" error. In either case, Acrobat cannot successfully open the file generated by Photoshop CC, while it can if the file was made using Photoshop CS6.
RGB PDF JPEG HIGH - PsCC - No Error.pdf
https://drive.google.com/uc?export=&confirm=no_antivirus&id=0B9lLW1Ml0 0iQc2ltUXhqVTRVaWs
CMYK PDF JPEG HIGH - 500x500px - PsCC - Acrobat drawing error.pdf
https://drive.google.com/uc?export=&confirm=no_antivirus&id=0B9lLW1Ml0 0iQZ3M2SS1PbEJuN0k
CMYK PDF JPEG HIGH - 2500x2500px - PsCC - Acrobat out of memory error.pdf
https://drive.google.com/uc?export=&confirm=no_antivirus&id=0B9lLW1Ml0 0iQd2dxWm9xWjk3bEE

Another observation, for what it is worth (again, not a solution). I was able to open all 3 PDFs in IrfanView as graphics. The properties of the image were:
The major changes for the others within the JPEGs were the number of pixels (133x133) and the size (1.4in x 1.4in). There is also a reduction in the number of colors.
Just for information, I opened the big JPEG in Acrobat (9.5.5) and it opened fine. However, it looks like the color space was changed, as shown below:
Even if I change the input to use CMYK and not RGB, it looks like it gets converted to RGB. I haven't spent time to figure out the preflight analysis properly, but at least this may give you some info back. -
Scrolling text is jumpy/pulsing when compressed
I'm making end credits using the scroll up text animation behavior in Motion 2.1.2. Whenever I compress it for DVD Studio Pro and insert it the text looks terrible. It is pulsing in and out and is hardly readable. I found a few other people who have had this problem, but no solutions that have helped.
I have done credits using the scrolling text in Final Cut without much problem, but for this project I need motion in order to use text of different sizes.
Here are my specs:
Project Properties - NTSC DV; 720x480; Pixel Aspect Ratio: NTSC D1/DV - 0.90; Field Order: Lower First (Even); Frame Rate: 29.97; Background color is black (0%)
Render Settings - Motion Blur Samples: 8; Shutter Angle: 360; Output Antialiasing Method: Best
I used Export using Compressor: DVD Best Quality 90 minutes 4:3; I tried using the Color+Alpha and just Color in the Output options. Premultiply alpha and Use field render are both checked. Use motion blur is not checked.
Most of the text is Geneva Regular 14 pt font. I have tried completely white as well as setting the RGB sliders all at 235 with no difference. I also tried using a black outline with no difference. The scroll up behavior has a rate of 52.
Please help! It took me forever to put these credits together and I don't want to start from scratch with another program. Making the text bigger helps a little bit but I would really like to keep it the same size for timing and format sake. However, I can't keep it like this because it looks terrible both on an Apple Cinema display and on a tv after burned to DVD.
Thanks!

My first guess would be the font itself. Geneva is a pretty thin font and it's not likely to look good interlaced on a TV. Can you try a thicker font?
I'd also turn off Field Rendering. It looked much worse with it on than off. Keep Frame Blending though.
Andy -
Compress an image before upload
Hi,
Is there a way to compress an image using Flex? the scenario I have is: the user takes a picture from a mobile device camera, I then upload it to the cloud. I'd like to be able to shrink the image in size before I upload it as it takes ages.
Many thanks

I believe there are PNG and JPEG encoders.
-
Disabling RDP Compression on Windows Server 2012 R2
Hi
On Windows 2008 R2, we could disable RDP compression via GPO by configuring "Do not use an RDP compression algorithm" in the following GPO
Computer Configuration\Administrative Templates\Windows Components\Remote Desktop Services\Remote Desktop Session Host\Set compression algorithm for RDP data
It seems like with the Windows Server 2012 R2 GPOs, this setting is no longer available. How do we disable RDP compression so that we can use it with Riverbed products?
Thanks

Hi,
How is the issue going now? Is there any update?
Thanks.
Jeremy Wu
TechNet Community Support -
Lync 2013 Front End SIP/2.0 500 Compression algorithm refused
I've deployed a brand new Lync 2013 environment hosted on Windows Server 2012 R2 that is currently in co-existence mode with my Lync 2010 environment.
I have SCOM 2012 monitoring the environment and it recently started reporting that one or more of my front end servers was in a critical state. Diving into it revealed the following perf counter threshold was being tripped:
Time Sampled: 3/26/2014 2:33:30 PM
Object Name: LS:SIP - Responses
Counter Name: SIP - Local 500 Responses
Instance Name:
First Value: 14287
Last Value: 14340
Delta Value: 53
Using OCSLOGGER.exe on the front end to capture logs, i trapped the following:
TL_INFO(TF_PROTOCOL) [11]9138.1C58::03/26/2014-19:12:39.098.0022c780 (SIPStack,SIPAdminLog::ProtocolRecord::Flush:ProtocolRecord.cpp(265))[120713120] $$begin_record
Trace-Correlation-Id: 120713120
Instance-Id: 7D80EB
Direction: outgoing;source="local"
Peer: poolA.contoso.com:63820
Message-Type: response
Start-Line: SIP/2.0 500 Compression algorithm refused
FROM: <sip:poolA.contoso.com>;ms-fe=FEserver1.contoso.com
To: <sip:poolA.contoso.com>;tag=F8B88CAB38613EB380773027C56D94AF
CALL-ID: 986f9f568c794ce39d33d7158376157b
CSEQ: 1 NEGOTIATE
Via: SIP/2.0/TLS 10.154.228.225:63820;ms-received-port=63820;ms-received-cid=C3D7C00
Content-Length: 0
ms-diagnostics: 2;reason="See response code and reason phrase";HRESULT="0xC3E93C0F(SIP_E_REACHED_CONFIGURED_LIMIT)";source="FEserver1.contoso.com"
Server: RTC/5.0
$$end_record
The only recent change made to the front end servers was making the registry change outlined in this article:
http://support.microsoft.com/kb/2901554/en-us so I'm wondering if that has something to do with it.

The MSFT support person said to re-apply CU5 to the Director servers and reboot. Since this is impactful to the environment and I would have to do reboots anyway, I opted to go the route of installing the more recent update, so...
Last weekend I updated my Lync environment with what I think is considered CU6, the September 2014 updates for Lync 2013 Server (https://support.microsoft.com/kb/2809243), and still no luck. The front end servers are fine; no excess SIP 500 errors occurring there, but within 30 minutes of removing the SCOM override on the Director servers the alerts started firing again.
I reinstated the override in SCOM for the Directors and had my case with Premier support un-archived. The MSFT support person said if the alerts didn't go away she was going to have to engage the Lync product group for help. We'll see where it goes from here.
JKuta