Collections & offset

Hi,
I'm using a collection to populate a report.
I have a process that updates the collection on every submit (On Submit, Before Computation).
There is no problem if the report fits on a single page. But if the report spans two pages, say 20 rows, I have trouble updating the collection.
I run a query to retrieve the collection and then update it with the values from the page's tabular form arrays (wwv_flow.g_f01, g_f02, ...). See the code below.
I need to find a way to update only the rows that are shown in the report (an offset or something similar).
Any ideas?
declare
  c pls_integer := 0;
begin
  -- Walk every member of the collection in seq_id order and pair it
  -- positionally with the submitted tabular form arrays. NOTE: the
  -- g_fXX arrays only contain the rows rendered on the current page,
  -- so on a paginated report c eventually runs past the end of the
  -- arrays and the positional pairing breaks.
  for c1 in (select seq_id
               from apex_collections
              where collection_name = 'MATRIX'
              order by seq_id) loop
    c := c + 1;
    apex_collection.update_member_attribute(p_collection_name => 'MATRIX',
      p_seq => c1.seq_id, p_attr_number => 4, p_attr_value => wwv_flow.g_f01(c));
    apex_collection.update_member_attribute(p_collection_name => 'MATRIX',
      p_seq => c1.seq_id, p_attr_number => 5, p_attr_value => wwv_flow.g_f02(c));
    apex_collection.update_member_attribute(p_collection_name => 'MATRIX',
      p_seq => c1.seq_id, p_attr_number => 6, p_attr_value => wwv_flow.g_f03(c));
    apex_collection.update_member_attribute(p_collection_name => 'MATRIX',
      p_seq => c1.seq_id, p_attr_number => 7, p_attr_value => wwv_flow.g_f04(c));
  end loop;
end;
thanks,
Mike

I found a way to do it:
DECLARE
  c            PLS_INTEGER := 1;
  l_total_item NUMBER := wwv_flow.g_f07.count;  -- rows actually submitted from the current page
BEGIN
  FOR c1 IN (SELECT seq_id, c007 AS id
               FROM htmldb_collections
              WHERE collection_name = 'MATRIX'
              ORDER BY seq_id) LOOP
    -- g_f07 is a hidden column holding the row id (c007), so a member is
    -- only updated when it matches the next submitted row; members shown
    -- on other pages are skipped.
    IF c <= l_total_item THEN
      IF c1.id = wwv_flow.g_f07(c) OR c1.id IS NULL THEN
        htmldb_collection.update_member_attribute(p_collection_name => 'MATRIX',
          p_seq => c1.seq_id, p_attr_number => 1, p_attr_value => wwv_flow.g_f01(c));
        htmldb_collection.update_member_attribute(p_collection_name => 'MATRIX',
          p_seq => c1.seq_id, p_attr_number => 2, p_attr_value => wwv_flow.g_f02(c));
        htmldb_collection.update_member_attribute(p_collection_name => 'MATRIX',
          p_seq => c1.seq_id, p_attr_number => 3, p_attr_value => wwv_flow.g_f03(c));
        htmldb_collection.update_member_attribute(p_collection_name => 'MATRIX',
          p_seq => c1.seq_id, p_attr_number => 4, p_attr_value => wwv_flow.g_f04(c));
        htmldb_collection.update_member_attribute(p_collection_name => 'MATRIX',
          p_seq => c1.seq_id, p_attr_number => 5, p_attr_value => wwv_flow.g_f05(c));
        htmldb_collection.update_member_attribute(p_collection_name => 'MATRIX',
          p_seq => c1.seq_id, p_attr_number => 6, p_attr_value => wwv_flow.g_f06(c));
        c := c + 1;
      END IF;
    END IF;
  END LOOP;
END;
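A note on an alternative: if the report query renders the member's seq_id itself into a hidden column (for example with apex_item.hidden(7, seq_id)), the process can loop over the submitted arrays instead of the whole collection. The sketch below assumes that setup (g_f07 carries seq_id rather than c007, and attributes 4 and 5 stand in for whichever columns are editable); it illustrates the idea and is not drop-in code.

declare
begin
  -- One iteration per row actually submitted from the current page;
  -- rows on other pages are never visited at all.
  for i in 1 .. apex_application.g_f07.count loop
    apex_collection.update_member_attribute(
        p_collection_name => 'MATRIX',
        p_seq             => to_number(apex_application.g_f07(i)),  -- hidden seq_id column
        p_attr_number     => 4,
        p_attr_value      => apex_application.g_f01(i));
    apex_collection.update_member_attribute(
        p_collection_name => 'MATRIX',
        p_seq             => to_number(apex_application.g_f07(i)),
        p_attr_number     => 5,
        p_attr_value      => apex_application.g_f02(i));
  end loop;
end;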

Similar Messages

  • Database, Dataset, Table Adaptors Error "Unable to load, Update requires a valid DeleteCommand when passed DataRow collection with deleted row"

    Microsoft Visual Basic 2010 Express.
    I am new to Visual Basic programming and I am trying to understand the relationships between datasets, the database, and table adapters. The following code gives me the error "Unable to load, Update requires a valid DeleteCommand when passed DataRow collection with deleted rows".
    I can track the error down to the "OffsetTableTableAdapter.Update(MaterionOffsetDataSet.OffsetTable)" call. What am I missing?
    It seems that I can delete the data in the DataGridView table, and it then displays the correct data, but my database is not updating even though the grid displays differently. I can tell because when I save the offset database, I still have all the previous uploads, and all the rows I wanted to delete are still there.
    My final goal is to import offset data from a CSV file, save this data on the PC, and send a copy of it to a NumericUpDown so the customer can modify certain numbers. From there they download all the data to a controller. If the customer needs to modify the imported data, they can go to a tab with a DataGridView and modify the table. They will also have the option to save the modified data to a CSV file.
    I'm not sure if I am overcomplicating this or if there is an easier way to program it.
    CODE:
    Private Function LoadOffSetData()
        Dim LoadOffsetDialog As New OpenFileDialog 'create a new open file dialog and setup its parameters
        LoadOffsetDialog.DefaultExt = "csv"
        LoadOffsetDialog.Filter = "csv|*.csv"
        LoadOffsetDialog.Title = "Load Offset Data"
        LoadOffsetDialog.FileName = "RollCoaterOffset.csv"
        If LoadOffsetDialog.ShowDialog() = Windows.Forms.DialogResult.OK Then 'show the dialog and if the result is ok then
            Try
                Dim myStream As New System.IO.StreamReader(LoadOffsetDialog.OpenFile) 'try to open the file with a stream reader
                If (myStream IsNot Nothing) Then 'if the file is valid
                    For Each oldRow As MaterionOffsetDataSet.OffsetTableRow In MaterionOffsetDataSet.OffsetTable.Rows
                        oldRow.Delete() 'delete all of the existing rows
                    Next
                    'OffsetTableTableAdapter.Update(MaterionOffsetDataSet.OffsetTable)
                    Dim rowvalue As String
                    Dim cellvalue(25) As String
                    'Reading CSV file content
                    While myStream.Peek() <> -1
                        Dim NRow As MaterionOffsetDataSet.OffsetTableRow
                        rowvalue = myStream.ReadLine()
                        cellvalue = rowvalue.Split(","c) 'check what is your separator
                        NRow = MaterionOffsetDataSet.OffsetTable.Rows.Add(cellvalue)
                        Me.OffsetTableTableAdapter.Update(NRow)
                    End While
                    Me.OffsetTableTableAdapter.Update(MaterionOffsetDataSet.OffsetTable)
                    MainOffset.Value = OffsetTableTableAdapter.MainOffsetValue 'saves all the table offsets to the offset numericUpDown registers in the main window
                    StationOffset01.Value = OffsetTableTableAdapter.Station01Value
                    StationOffset02.Value = OffsetTableTableAdapter.Station02Value
                    myStream.Close() 'close the stream
                    Return True
                Else 'if we were not able to open the file then
                    MsgBox("Unable to load, check file name and location") 'let the operator know that the file wasn't able to open
                    Return False
                End If
            Catch ex As Exception
                MsgBox("Unable to load, " + ex.Message)
                Return False
            End Try
        Else
            Return False
        End If
    End Function

    Hello SaulMTZ,
    >> I can track the error down to the "OffsetTableTableAdapter.Update(MaterionOffsetDataSet.OffsetTable)" call. What am I missing?
    This error usually shows that you have not initialized the DeleteCommand object; you could check this article to see if you get a workaround.
    >> I'm not sure if I am overcomplicating this or if there is an easier way to program it.
    If you are working with a CSV file, you could use OleDb to read it, which treats the CSV file as a table:
    http://www.codeproject.com/Articles/27802/Using-OleDb-to-Import-Text-Files-tab-CSV-custom
    That seems easier (in my opinion).
    Regards.

  • System error message and application crash - ModName: lvrt.dll Offset: 00080a6a

    I'm running into an error message which is proving particularly difficult to debug. After my built application has been running for some time, a Windows system error pops up telling me my application has done something bad and needs to be shut down (and sorry for the inconvenience). Clicking through to the "see what the error report contains" page reveals the issue as being in the lvrt.dll module at offset 00080a6a.
    This unfortunately doesn't mean much to me.
    On some occasions when I click "Don't Send", another error message pops up informing me there is "Not enough memory available to complete this operation".
    The program will typically run between 8 and 24 hours before the error occurs, though I've had it run error-free for upwards of 48.
    My program is unfortunately fairly large, and depends on communication with proprietary hardware, so it can't be uploaded in a functional state. In general terms, the body of the program (which is executing when the error occurs) involves RS-485 serial communication and simultaneous display of data on 7 continuously updating strip charts (producer-consumer design), along with envelope detection and RMS calculations in real time, all of which is multiplied by two parallel channels (around 1 Mb/s data rate for each). The data is also buffered and stored in binary files for later review. A binary file is closed and a new one is opened for every 10 minutes of data collection (approx. 50 MB of data).
    There are also about 40 user controls associated with each channel, which are polled periodically (a couple of times a second). The crashes occur during overnight tests, however, when there is no user input. Globals are used for inter-panel communication, but no global is written to in more than one location (most are read in multiple locations).
    I know this is a bit of a shot in the dark, but does the lvrt.dll error at offset 00080a6a mean anything to anyone? Are there any "typical" causes for a failure in lvrt.dll having to do with prolonged high-data-rate serial communications? For reference, this has occurred on multiple machines, but all the machines it has occurred on are identical out-of-the-box laptop models running XP 32. I'm going to try to run an overnight test tonight on the development machine and see if I get any different results (Vista 64 vs Windows XP 32).
    I can't say my hopes are high with this one, and I'm going to continue to comb over the code for any possible conflicts, but any insight would be appreciated.

    Claire,
    Next time the error message comes up I will take a screenshot of it for you. Unfortunately that may not be until Monday, as tonight is the soonest I'll be able to run another long-term test. The time between failures is the most frustrating part to debug!
    There are no errors on compiling or building the application in LabVIEW. My testing has been done mostly on the distribution laptops in application form, with no LabVIEW dev tools installed, Windows XP 32 bit.
    The overnight test I attempted on the development machine (within the LabVIEW environment) welcomed me the next morning with a rebooted machine, and so I wasn't able to see an error message (if there was one). Vista 64 on the development machine. Reviewing the data files, I got about 9 hours worth before the crash.
    Given the time between failures and the large volume of data I'm passing around, this looks like memory-leak behavior. I'm a bit confused by the fact that it seems to fail in lvrt.dll every time, though. I don't know what that indicates, if anything.

  • Does Weblogic server 8.1 support "LIMIT and OFFSET" in EJB QL?

    hi,
    Can anyone tell me whether WebLogic Server 8.1 supports the "limit" and "offset" keywords in EJB-QL?
    I tried to define a finder method in the workshop with the following syntax:
    @ejbgen:finder ejb-ql="SELECT OBJECT(o) from EmpBean OFFSET ?1 LIMIT ?2" generate-on="Local" signature="Collection findEmp(java.lang.Integer st,java.lang.Integer en)"
    I get the following error
    ERROR: ERROR: Error from ejbc: [EJB:011017]Error while reading 'META-INF/weblogic-cmp-rdbms-jar.xml'. The error was:
    ERROR: Query:
         EJB Name: EmpBean
         Method Name: findEmp
         Parameter Types: (java.lang.Integer, java.lang.Integer)
    SELECT OBJECT(o) from EmpBean OFFSET =>> 1 <<= LIMIT 2
    EJB QL Parser Error.
    39: unexpected token: 1
    What am I doing wrong?
    thanks

    I could not find any documentation that suggested LIMIT or OFFSET were supported. Some of our developers needed to use LIMIT; in the end, all I could suggest was using a dynamic query and setting the maximum number of results (see here).
    Hussein Badakhchani

  • Batch Change Date/Time - same time to full batch, not offset timing?

    Is there a way to force Aperture to change the date/time on a batch of photos so that ALL images get the same exact date/time?
    Right now I have a bunch of scanned images with the wrong date/time, scanned, of course, at different times (the embedded image date/time). When I use Aperture 2.1's "Metadata" > "Adjust Date and Time..." function, only one image gets the desired date/time; all the other scans get a different date/time based on the timing offset from the scanning, which over the course of several scans drifts by several days.
    Does anyone know of a workaround? I'd hate to manually adjust hundreds or thousands of images' metadata by hand when Aperture should do this easily.
    Thanks!!

    Correct, I am referring to the EXIF data. Aperture can change this data using the aforementioned "Metadata" > "Adjust Date and Time..." feature; it's only that Aperture can't (as far as I can tell) set a batch of photos to the exact same date and time across the entire batch.
    Perhaps I'll be able to find or create an AppleScript to do this outside of Aperture?
    It's a shame, though, as I can't possibly be the only one with a large scanned collection from years ago that I'm trying to archive in proper date order. This would be a great new feature, Apple, if you are also reading this! (I've already submitted the feature request some weeks ago.)

  • DAQmx: separate multiple physical channels, apply equations and collect information

    Dear People!
    I am using a DAQCard-6023E with 16 channels. My purpose is to collect information from a gasifier: temperatures from 7 of the channels and pressures from 4 channels. Forget about the pressure, to make the problem easier.
    The readings we get from the gasifier are in millivolts, amplified by an intensifier which also adds an offset. When the readings come into LabVIEW I take all 7 channels, pass them through DAQmx Create Channel, and add a Sample Clock. I start the task with DAQmx Start Task and put everything inside a loop where DAQmx Read reads all the physical channels. Now I need to reverse-engineer the mV (which are amplified) by subtracting the offset and dividing by the amplification for each separate physical channel (each channel has a different offset and amplification). Having done this, I pass them through a thermolinear module to convert the voltage to temperature. This information then goes to a "Write to Measurement File" so I have one file with all the information. Then the while loop closes, and we have DAQmx Clear Task and the error handling (very basic).
    If I had only one physical channel it would be very easy: I would just use a Subtract and a Divide module to reverse-engineer the mV before passing them to the thermolinear module.
    The two questions I have are:
    1) How is it possible to separate the physical channels inside the while loop, before or after the DAQmx Read (I don't really know what is best), to apply to each physical channel the appropriate offset (subtract) and amplification (divide), and then reconnect them so that they go to the "Write to Measurement File" to create one file with all the physical channels, and not one file per physical channel?
    2) The thermolinear module needs a cold-junction voltage (for the 25 °C ambient temperature). Does this voltage need to be in V or in millivolts, as my input is in millivolts? Also, the thermolinear module says it needs microvolts; do I need to convert for the module to work?
    The thermocouple I use is a K type with constant units (Celsius), IC sensor and voltage reference. I hope I was not too long in explaining my problem, and I hope somebody can help me with this.
    Thank you very much to the whole community; I hope to hear from somebody in the near future. I have a deadline and I cannot see how I will do this. I am new to LabVIEW; I have been using it for only 3 days.
    Kind Regards
    Alex

    Thank you very much for the response.
    All day I was thinking about this, and it is exactly what I did, with the difference that at the end I used a "Build Array" module to reconnect all the separate channels.
    1a) Do you think it is better to use the "Replace Array Element" module rather than the "Build Array" module?
    I also have problems with the thermolinear module. Is it the appropriate module for converting millivolts to temperature? Inside, the module says it needs the voltage in microvolts.
    1b) Do I need to change my millivolts to volts before feeding the thermolinear module? Or do I need to change my millivolts to microvolts first? Or can I just leave them as millivolts, and LabVIEW will do all the work for me in the thermolinear module?
    2b) Does the cold-junction voltage need to be in millivolts, microvolts or volts?
    I still cannot make the whole system work, but I know I am on the right path.
    Thank you, and I hope to draw on your knowledge again.
    Alex

  • Down payment tax offset account

    Hello Experts
    We have a Down Payment Tax Offset account in G/L Account Determination. I have mapped it, but there is no effect on this account when making an A/R down payment. I want to avoid tax when collecting a down payment. How can we manage this?
    Regards
    Manoj

    Hi Manoj,
    Have you solved this issue yet?
    Regards,
    Dwarak

  • Validate Offsetting Account (GKONT) for Asset

    Hi all,
    Is there a way to programmatically validate that an offsetting account (GKONT) is valid for an existing asset (ANLN1)?
    That is, I am trying to emulate transaction ABZON: I need to enter an asset (ANLN1) and then an offsetting account (GKONT) that needs to be valid. I know there is a collective search help, but 1) I wouldn't know how to use it in a program, and 2) it also offers invalid choices (i.e. using one of its results will not allow saving, due to an error).
    Any ideas? Points will be awarded for helpful answers.
    Thanks in advance.

    Hi,
       Check against SKB1 that:
           a) SAKNR exists for the company code, and
           b) SKB1-MITKZ = space (i.e. it is not a reconciliation account).
    Basically, any non-reconciliation G/L account is valid for ABZON.
    Further validations would depend on the system configuration and on what the users consider a valid account to be.
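    For illustration, the same check expressed as a plain SQL sketch against SKB1 (in a real system this would be a SELECT SINGLE in ABAP; the bind-variable names here are made up):

    -- A G/L account is a valid ABZON offsetting account if a SKB1 row
    -- exists for the company code and it is not a reconciliation account.
    select saknr
      from skb1
     where bukrs = :p_company_code   -- company code to validate against
       and saknr = :p_gl_account     -- candidate offsetting account (GKONT)
       and mitkz = ' ';              -- space = not a reconciliation account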

  • Basic questions about permanent Collections

    I am just learning about Collections, discovering that a collection preserves all of the work I have done in it. I am assuming, then, that the images in a collection are virtual copies that can be adjusted in the Develop module to differ from how they are displayed in the general catalog under the original imported folders. I also assume that a modified image cannot be locked to prevent further changes, say if another virtual copy were to be made.
    I gather that collections need to be protected somehow, for instance if I have one collection with images adjusted to look good on the web, and another collection of the same images adjusted to look good on a certain printer.
    Maybe I will never need to adjust images for different applications, except outside of Lightroom for CMYK applications in offset printing. Since the various settings for Slideshow, Web and Print can all be contained in one collection, it would probably be best to use the same images without changing them for each output.
    Can some of you who are more experienced add thoughts on how best to use Collections? My goal is to keep things simple.
    TIA,
    Ken

    There are not enough gold stars to go around, but you all deserve them for the helpful answers.
    I see on pg. 175 of the Lightroom Classroom in a Book (in which there is also an unusual number of typos) that "In the Quick Describe metadata set, the Metadata panel shows the Filename, Copy Name (if the image is a virtual copy...". This is more evidence that Lightroom and its metadata system are quite well developed, including the 'Copy Name' for a virtual copy. I have found already, in my short experience with Lightroom, that I want to work on a virtual copy when I'm not quite sure of the direction I want to take with a photo's adjustments. By retaining the master 'as is' (with modest adjustments) I can always quickly go back to view the starting point (the best image with modest adjustments) while I continue to work on a virtual copy for more experimental adjustments.
    On the one hand, my shooting is going on hold while I learn Lightroom, but I feel it is a worthwhile investment to become as familiar as possible with Lightroom's capabilities. It adds to my desire to capture images more accurately, because I know I have so much to work with once I get the images back into the studio. Crisper images with optimal lighting will make me as happy as a pig in mud when I get them into Lightroom.

  • Initial Offset Storage

    This is my first LabVIEW app, and I was denied the training course due to budget concerns, so this may be an elementary question. I tried the online resources and could not make enough sense of things to do what I need to do.
    I am acquiring four load cell inputs from a cDAQ system with an NI-9237 strain input module. I am getting the data using the DAQ Assistant and then filtering it. I have graphs of the unfiltered and the filtered data, as well as a numeric display of the current filtered value. I have hit a wall at the next step: I need to capture a set of values (one per channel, 4 total) at the same time (with a button on the front panel) to be stored as an offset for use in calculations. I found an example with a somewhat similar activity, but I was unable to modify it to work for me.
    Once I have the offset values, I would like to subtract this offset from the filtered values before graphing and display. I will then use the compensated values in a formula to determine an induced force, which I would like to display. Finally, I will write the 4 compensated values and the resulting induced force to a TDMS file for later use in DIAdem.
    Any help would be greatly appreciated. Please also comment on the code I am attaching, because I may not be doing things very efficiently.
    Thanks,
    Wayne
    Attachments:
    Trailer Load Cell External Excitation.vi ‏143 KB

    I think I understand a little better now. Since you're always reading the same channels you don't have to worry about dealing with different DAQ tasks. You will need to have a more structured program. I would suggest starting off with a state machine architecture, although you might find the producer-consumer architecture to be the most flexible. There are a number of functional and user requirements that you will need to define. For example:
    Is there a possibility that the gathering of the offsets and the actual monitoring is separated in time by a lot, and could have a computer shutdown in between while the second trailer is connected?
    Should the offsets be saved to a file? When the program starts up you will probably want to read the offsets from the file so you have something to start with. These would be the last measured offsets, and may be valid ones to use. In essence, you would be saving the calibration values.
    Do you need to have a human confirm the offsets to use? In other words, is there some "leveling off" of the values, so that you take the readings only when they've "settled"?
    Data collection rate. This will need to be known fairly early on since it will impact your software architecture and data collection algorithm.
    These are just a few things off the top of my head. Since you don't have that much experience with LabVIEW you may want to consider looking at local NI Alliance members who are experienced in dealing with these kinds of applications (both in terms of hardware and software). If you contact your local NI rep they can put you in touch with someone in your area.

  • COLLECT_OVERFLOW_TYPE_P occurred collecting amount into GLT3 table, field KSL07

    Runtime Errors: COLLECT_OVERFLOW_TYPE_P
    Except.:        CX_SY_ARITHMETIC_OVERFLOW
    Date and Time:  27.07.2009 18:31:32
    Short text: An internal table field has been defined too small.
    Error analysis: An exception occurred that is explained in detail below. The exception, which is assigned to class 'CX_SY_ARITHMETIC_OVERFLOW', was not caught in:
      1506     ASSIGN TAB_GLT3-KSLVT INCREMENT INT_GLS3_ADD-OFFSET TO <FELD>
      1507                           RANGE TAB_GLT3 CASTING TYPE P.
      1508     MOVE INT_GLS3-KSL TO <FELD>.
    >>>>>     COLLECT TAB_GLT3.
      1510   ENDLOOP.
      1513   IF SY-ONCOM = 'P'.
      1514     PERFORM NEW_UPDATE_GLT3.
    The above error clearly says that the overflowing field is KSL07 in table GLT3, but we checked both the file and the table, and there is no overly large value in SAP. The file is uploaded using program RFBIBL00.
    Please advise why this type of error occurs.
    Thanks

    HI,
    The amount being calculated is too big to be accommodated in the field KSL07.
    Check whether the decimal places are being considered properly.
    Regards,
    Ankur Parab
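    For readers who hit this dump cold: COLLECT adds amounts into a packed field, and the dump means the accumulated total no longer fits KSL07's declared size. A rough PL/SQL analogy of the same failure mode (purely illustrative; the real field is an ABAP packed decimal in GLT3):

    declare
      v_amount number(7,2);        -- a field "defined too small", like KSL07
    begin
      v_amount := 99999.99;        -- fits: 7 digits, 2 of them decimals
      v_amount := v_amount + 1;    -- the COLLECT-style accumulation overflows
    exception
      when value_error then
        dbms_output.put_line('overflow: value exceeds the declared precision');
    end;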

  • Problem with embedded font on offset printing

    Hello,
    I generated a leaflet with InDesign CS5 and exported it as PDF/X-3:2002. In the document, I used quote signs (American Typewriter Light font). When I export the PDF, everything goes well, and when I check in Acrobat, the font seems correctly embedded. However, when I picked up the leaflets at the printing house today (it's offset printing), none of the quotation marks were printed (squares instead). Where does the problem come from? Did the printing people make a mistake, and what mistake? Or is it me?
    Thanks for your help!!

    Unfortunately, I haven't been able to respond to this thread up until now, but having reviewed this full thread, I have a few comments to offer:
    It is irrelevant whether the original text was copied and pasted or whether the font was Type 1, TrueType, a TrueType collection, OpenType CFF, etc. The real issue is whether the PDF/X-3:2002 file exported from InDesign is indeed valid. This can very easily be verified by running the appropriate preflight profile for PDF/X-3:2002 in Acrobat Pro 10 or 11. There is no need for any third party plug-ins or any other software for this.
    Assuming that preflight pronounces the file to be 100% kosher with no exceptions and that the file displays properly in Acrobat Pro, you should then try printing the file (or even just the pages that exhibited the problem) from Acrobat Pro to a printer that supports Adobe PostScript 3.
    If the printed output from the PostScript printer is correct as well, the problem lies totally with your print service provider and their workflow.
    In our extensive experience, problems can occur in any number of ways. One of the most frequent sources of problems is printers who routinely open PDF files in Adobe Illustrator to view and “fix” them, somehow thinking that Adobe Illustrator is a general purpose PDF editor. For the record, Adobe Illustrator is not, repeat not, repeat yet again not a general purpose PDF file editor. Adobe Illustrator can only successfully edit PDF files saved from Adobe Illustrator itself when the “save editability” option is specified. Even then, Illustrator requires you to have all fonts referenced by the PDF file actually installed on your system; it does not use any fonts embedded in the PDF file. For other PDF files (such as those produced by Distiller, export from InDesign, etc.), character substitutions (especially for non-ASCII characters such as typographical quotes), font substitutions, color space changes, etc. are common issues.
    If the printer is not abusing Illustrator on your InDesign-exported files, it could be any number of third party products that many printers use, some of which are of dubious value and some of which may be either defective in terms of compliance with the PDF specification or simply grossly out-of-date. (It is amazing how many print service providers put their customers' submitted PDF files through all sorts of “fixup” processes, regardless of whether the files need such fixups or not. Many of these “fixups” are more likely to cause problems than they are to fix anything!)
    Under no conditions should you need to fully embed fonts in your PDF files. Certainly no Adobe software, including applications, Acrobat, and our PostScript and PDF RIP technologies licensed to OEMs, is in any way sensitive to whether fonts are subsetted or not! If a third party workflow component is so sensitive, it is by definition defective.
    In terms of imposition, there should likewise be no sensitivity to whether fonts are subsetted or not. By the way, the most reliable imposition is actually done at the RIP itself, not by programs that either place PDF files into other documents and regenerate PDF or by utilities that rearrange PDF files or create new PDF files representing the printed flats.
    This thread also discussed the issue of whether the OP received a “proof.” Unless you receive a hard copy proof in which the content was put through the exact same workflow as the final product, including RIP and all prepress steps, it is very possible that the defect noted would not show up on the proof. Nonetheless, obtaining a hard copy proof is an excellent idea, especially if the issue comes up about liability for the cost of reprinting the job or even whether the original job should be paid for if the print service provider can't “fix” their workflow problems!
              - Dov

  • Book Module-Export saved book from collections--Book is not there?

    LR6. I had been working on a book and had to travel, and I wanted to finish the book while on the road. I saved the book, which put it in the collection I was working on, then exported the collection onto a portable disk. However, when I loaded it on my laptop, the book is not there. The images are there, but NO BOOK. Do I have to start over?

    Are we looking at printed or screen proofs here?
    Not that I have an answer for you, but sharpening might not be visible on screen if that is what you are comparing. Sharpening for print should differ from sharpening for screen. The 'print' sharpening, presumably for inkjet although it could equally be for offset, will be related to final image size, so viewing on screen at a potentially different size and resolution might be obscuring what would occur in a printing process.
    I can't verify since I am still using LR3.6.

  • Variable Offsets

    I have a query in Query Analyzer with 2 key figures, both looking at the same thing (Number of Orders).
    I would like the 1st key figure to be the value for the current month (based on a variable entry, i.e. 09.2009) and the 2nd to be the last 3 months based on the variable entered for the 1st key figure (i.e. 07.2009 - 09.2009).
    I have tried restricting the 1st key figure with the variable 0I_CMNTH on 0CALMONTH and then restricting the 2nd key figure with a value range of [0I_CMNTH - 2 .. 0I_CMNTH], but this comes up with the following error:
    Variable 0I_CMNTH (5E4II2344I3JG4R6ACMYQ441Z) was found in the query but in the meantime it has been deleted, or it has been used incorrectly.
    Is this possible, and if so, how would I go about doing it?
    Any help would be gratefully received.
    Kind Regards
    Carly

    Hi,
    You can achieve this by using an offset variable.
    If you are using 0CALDAY, create an offset variable of -90 days (use Calday and Calday - 90 in the selection; that is, use the same variable twice).
    If you are using 0CALWEEK, create an offset variable of -12 (use 0Calweek and 0Calweek - 12 in the selection; again, the same variable twice),
    and restrict the key figure with them.
    About the error: release the request in the transport collection (SE09), then delete the variable and recreate it.
    Alternatively, create customer exit variables var1 and var2, one with user input and one without user entry.
    Write the logic in EXIT_SAPLRRS0_001: compare with SY-DATUM, use the low value - 90, and use I_STEP = 2, so that at run time it calculates the current date - 90 days and gives you the current date and the date 3 months back.
    Hope this helps,
    santosh

  • Collections Best Practices?

    We run one ASCP plan for 6 different inventory organizations on a centralised instance, 5 nights a week, with a complete refresh prior to each run. Is this the way most people run their business? We add new items, BOMs, routings, suppliers and customers on a daily basis. I would love to improve the performance of the overall process. It is an unconstrained plan in a discrete configure-to-order environment.
    Thanks in advance

    Stick, a few more points.
    ASCP will collect inactive items because the MRP Planning Method is not status controlled.
    If you have a large number of inactive items, you should make them all NOT PLANNED. That will stop ASCP from collecting those items/boms/routings etc.
    If sourcing splits are not that important to you, you can consider setting the Recalculate Sourcing History parameter to NO in the Planning ODS Load. Depending on your procurement history, it can have a huge impact on your ODS load process. If the sourcing split is somewhat important, you can consider running the ODS load with YES one day a week while leaving it at NO on the other days.
    Also see if MSC: Sourcing History Start Date Offset (in months) is set too high. If it is, you are collecting a large amount of history.
    If you don't do much with resources, make Recalculate Resource Availability parameter = NO.
    Check if you are collecting more orgs than necessary. Also, typically, there may not be a need to collect master org unless you do item substitution.
    You can also set MSC: Purge Stg Tbl Cntrl to Yes to speed up ODS Load.
    You can also run Targeted data collections during weekdays (and avoid collecting semi-static entities such as planners, calendars, ATP rules, UOMs, etc.) and run a full data collection on weekends.
    Hope this helps,
    Sandeep Gandhi
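    As a hedged illustration of the first point (the column names come from the standard mtl_system_items_b definition, but the status value and the "Not planned" lookup code should be verified in your instance), a query like this can flag inactive items that ASCP will still collect:

    -- Inactive items whose MRP planning method is anything other than
    -- "Not planned" (assumed lookup code 6 in MRP_PLANNING_CODE).
    select msi.organization_id,
           msi.segment1 as item,
           msi.inventory_item_status_code,
           msi.mrp_planning_code
      from mtl_system_items_b msi
     where msi.inventory_item_status_code = 'Inactive'  -- your inactive status code may differ
       and msi.mrp_planning_code <> 6;                  -- 6 = Not planned (verify in your instance)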
