Bulk data export
Hi,
Our users need to export large data sets based on ad-hoc attribute queries. For example, exporting the email addresses of customers in order to send them promotions. This kind of usage is hard to build with a dimensional model. Endeca looks like a good tool for our scenario; the only question is whether Endeca supports large data exports.
Thanks
Cheney,
EID Studio has components, such as the Results Table and Results List, that support record export. You can read more about this functionality in the Studio Users Guide found here: http://docs.oracle.com/cd/E29805_01/StudioUsersGuide.pdf
It is important to understand what specific volumes are expected for your "large data exports". Studio has safeguards which control maximum allowable export sizes, both in terms of Endeca records (for results tables showing raw Endeca records) and analytic records (for results tables employing EQL to aggregate the Endeca records to the grain of some other entity). These safeguard limits can be found in the "Framework Settings" section of the Studio Control Panel. Relaxing these defaults to allow larger export sizes can put extreme memory and CPU load on both Studio and the Endeca Server, so it is important to understand the performance implications of this approach.
Thanks,
Dan
Similar Messages
-
Best way of uploading bulk data in a Z tables
Greetings...
Actually we have to maintain bulk test data in a new system, so we need to upload bulk data into tables, most of them 'Z' tables....
Please suggest the best way to proceed in this case.
We came up with one solution: run an eCATT. Is this the only way, or do we have others?
Thanks in advance...
Hi,
Check this sample code:
TYPE-POOLS truxs.
TABLES: zacg_bomm.
PARAMETERS p_file TYPE rlgrap-filename DEFAULT 'E:\TEST.xls'.
TYPES: BEGIN OF t_tab,
         cbsize TYPE zacg_bomm-cbsize,
         sdesc  TYPE zacg_bomm-sdesc,
         menge  TYPE zacg_bomm-menge,
         meins  TYPE zacg_bomm-meins,
         ldesc  TYPE zacg_bomm-ldesc,
       END OF t_tab.
DATA: it_upload TYPE STANDARD TABLE OF t_tab,
      wa_upload TYPE t_tab,
      it_type   TYPE truxs_t_text_data.

AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
  CALL FUNCTION 'F4_FILENAME'
    EXPORTING
      program_name  = syst-cprog
      dynpro_number = syst-dynnr
      field_name    = 'P_FILE'
    IMPORTING
      file_name     = p_file.

START-OF-SELECTION.
* Convert the Excel file into the typed internal table
  CALL FUNCTION 'TEXT_CONVERT_XLS_TO_SAP'
    EXPORTING
      i_line_header        = 'X'
      i_tab_raw_data       = it_type
      i_filename           = p_file
    TABLES
      i_tab_converted_data = it_upload[]
    EXCEPTIONS
      conversion_failed    = 1
      OTHERS               = 2.
  IF sy-subrc <> 0.
    MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
            WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  ENDIF.

END-OF-SELECTION.
* Write each uploaded row to the Z table (MODIFY inserts or updates)
  LOOP AT it_upload INTO wa_upload.
    zacg_bomm-cbsize = wa_upload-cbsize.
    zacg_bomm-sdesc  = wa_upload-sdesc.
    zacg_bomm-menge  = wa_upload-menge.
    zacg_bomm-meins  = wa_upload-meins.
    zacg_bomm-ldesc  = wa_upload-ldesc.
    MODIFY zacg_bomm.
  ENDLOOP.
-
Essbase Data Export not Overwriting existing data file
We have an ODI interface in our environment which is used to export data from Essbase apps to text files using data export calc scripts, and then we load those text files into a relational database. Lately we are seeing an issue where the data export calc script is not overwriting the file and is just appending the new data to the existing file.
The OverWriteFile option is set to ON.
SET DATAEXPORTOPTIONS {
DataExportLevel "Level0";
DataExportOverWriteFile ON;
DataExportDimHeader ON;
DataExportColHeader "Period";
DataExportDynamicCalc ON;
};
The "Scenario" variable is a substitution variable which is set at runtime. We are trying to extract "Budget", but the calc script is not clearing the "Actual" scenario (the scenario that was extracted earlier) from the text file. It is as if, after the execution of the calc script, the file contains both "Actual" and "Budget" data. We have not been able to find the root cause of why this might be happening and why the OVERWRITEFILE command is not being honored by the data export calc script.
We have also deleted the text data file to make sure there are no temporary files on the server or anything. But when we ran the data export directly from Essbase again, the file again contained both "Actual" and "Budget" data, which is really strange. We have never encountered an issue like this before.
Any suggestions regarding this issue?
Did some more testing and pretty much zeroed in on the issue. Our scenario names are actually of the form "Q1FCST-Budget", "Q2FCST-Budget", etc.
This is the reason why we need to use a member function: the calc script reads "&ODI_SCENARIO" (which is set to Q2FCST-Budget) as a number and gives an error. To convert this value to a string we use the @MEMBER function, and this seems to be the root cause of the issue. The ODI_SCENARIO variable is set to "Q2FCST-Budget", but when we run the script with the function @MEMBER("&ODI_SCENARIO"), the data file brings back the values for "Q1FCST-Budget" out of nowhere, in addition to the "Q2FCST-Budget" data which we are trying to extract.
Successful Test Case 1:
1) Put the scenario "Q2FCST-Budget" hard-coded in the script
e.g. "Q2FCST-Budget"
2) Ran the script
3) Result OK. The script overwrote the file with Q2FCST-Budget data
Successful Case 2:
1) Put scenario in @member function
e.g. @member("Q2FCST-Budget")
2) Results again ok
Failed Case:
1) Deleted the file
2) Put the scenario in a substitution variable, used the member function @member("&ODI_SCENARIO"), and ran the script. *ODI_SCENARIO is set to Q2FCST-Budget in Essbase variables.
e.g. @member("&ODI_SCENARIO")
3) Result: the text file contained both "Q1FCST-Budget" and "Q2FCST-Budget" data values.
We are still not close to the root cause of why this issue is happening. Putting the sub var in the member function changes the complete picture and gives us inaccurate results.
Any clues anyone? -
Automate data export from website
Hello,
I am stuck with an issue regarding data export from scopus.com. I have 61 keywords for which I want to export (as CSV) the number of documents by year, country, document type, and subject area. Doing that by hand would be extremely tedious, hence my question whether this can be automated. Unfortunately Scopus does not provide an API to access the data directly.
So, the process to be automated would be:
1. Open http://www.scopus.com/home.url
2. Enter the keyword in "search for"
3. Click on "search"
4. Click on "Analyze results"
5. Click on tab "year" and "export". Save the file
6. Click on tab "country" and "export". Save the file
7. Click on tab "document type" and "export". Save the file
8. Click on "subject area" and "export". Save the file
Do programs exist that help me retrieve the data?
Thanks!
You could achieve your goal with bash and wget and some combination of --save-cookies and --post-data.
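If wget gets unwieldy, the same cookie-and-POST approach can be sketched with Python's standard library. This is only a sketch: the form field name ("searchfor") is an assumption, and the real field names and export URLs must be read from the Scopus page source.

```python
import urllib.request, urllib.parse, http.cookiejar

BASE = "http://www.scopus.com"
TABS = ["year", "country", "document type", "subject area"]

def build_jobs(keywords):
    # One export per keyword/tab pair, mirroring steps 1-6 above.
    return [(kw, tab) for kw in keywords for tab in TABS]

def make_opener():
    # Equivalent of wget --save-cookies: the jar persists cookies
    # across the search and export requests of one session.
    jar = http.cookiejar.CookieJar()
    return urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

def search_request(keyword):
    # Equivalent of wget --post-data for steps 2-3; the field
    # name "searchfor" is a guess, not the real form field.
    data = urllib.parse.urlencode({"searchfor": keyword}).encode()
    return urllib.request.Request(BASE + "/home.url", data=data)
```

With 61 keywords and 4 tabs this enumerates 244 export jobs, which is exactly the tedium a script removes.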
But be aware that Scopus is run by Elsevier, which many view as the most evil of the publishing companies. Think twice before doing something that may annoy them. -
Error while retrieving bulk data from mdm in staging server
I am getting an error while retrieving bulk data from MDM. The error goes like this: "NiRawReadError: An error occured when reading data from socket.".
Could anyone please suggest the possible cause of this error? Please reply soon.
Moderator message: network error, relation to ABAP development unclear, please search for available answers, check with system administrators, consult SAP service marketplace.
Edited by: Thomas Zloch on Nov 22, 2010 5:16 PM
Can you elaborate? I don't think the /*+ APPEND */ hint is working for me; I am still getting the same error.
If you have any other suggestion, I would like to hear it.
Should I commit after every 500 or so inserted records? At the moment I commit once, after the whole data set is inserted. -
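On the per-batch commit question above: a common pattern is committing every N rows rather than once at the end. A minimal sketch, using sqlite3 as a stand-in for the real database and an invented table name:

```python
import sqlite3

def bulk_insert(conn, rows, batch_size=500):
    cur = conn.cursor()
    for i in range(0, len(rows), batch_size):
        cur.executemany("INSERT INTO target(val) VALUES (?)",
                        rows[i:i + batch_size])
        conn.commit()  # one commit per batch keeps undo/redo pressure bounded

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target(val TEXT)")
bulk_insert(conn, [("row%d" % i,) for i in range(1200)])
```

Whether 500 is the right batch size depends on row width and the database's transaction overhead; it is a tuning knob, not a rule.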
Unable to stop incorrect date exports
How do we set up a form in Adobe Acrobat XI that allows dates to be formatted a certain way (mmm/dd/yyyy), exports them the same way to Excel, and has them always recognized as a "proper" date in Excel?
Currently the following does not work (Attempt #1):
Set up a field; Set the format category as date; Set up a Custom format of "mmm/dd/yyyy"
Create a distribution file
When users fill out the form, if they type in an incorrect date, e.g., "August 27 2013", the form automagically shows the date on the PDF as "Aug/27/13" - Great!
When the users submit the form and it's brought into the response file, the dates are shown in a default date format of mm/dd/yyyy - Fine, once the form owners understand this
When the form owners export the information, the exported data is the same as the original users entered it, not as it was automagically formatted. For instance, if submitters originally entered "August 27, 2013" then that's what goes across to Excel, and some of these formats Excel doesn't know how to convert. - Understandably frustrating for form owners
Attempt #2: As a workaround we set up special formatting that has a mask of "AAA/99/9999". This at least forces the users to use the same formatting, but it confuses submitters when they need to enter dates from 1-9, and we've also found that the conversion of this format to a date in Excel doesn't work, but at least it's consistent! JavaScript was also added to force users to use specific month abbreviations:
var d = new Date(event.value.replace(/-/g, " "));
if (!Date.parse(d)) { app.alert("Please enter a proper date.\n\nThe month abbreviations are: Jan, Feb, Mar, Apr, May, Jun, Jul, Aug, Sep, Oct, Nov, Dec"); }
Attempt #3: The last attempt was to continue with the setup from Attempt #1, but also use the JavaScript from Attempt #2. The theory was that if a user entered "August 27 2013" the JavaScript would complain. Alas, the JavaScript appears to run after Adobe automagically does its date format conversion.
Does anyone know how to get around this, or have any other ideas to either enforce a usable date format or have Adobe export the dates as they've been automatically formatted? We've tried to find a way to turn off the automatic date conversion that Adobe runs, but haven't found one yet. Another option seemed to be a mask that allows optional characters (so that the "0" wouldn't be needed for dates 1-9), but there doesn't seem to be one.
Thanks in advance!
Since there was no clear way to ensure that the date formatting was correct prior to exporting, we're going to have the respondents use drop-downs to ensure the formatting is correct. Not the most convenient for the users, though, as they're accustomed to being able to type values to select them (e.g., for the date of 23 they would expect to enter 2 then 3) based on other applications, but the Adobe pull-downs don't "group" what's been entered (e.g., 2 then 3 will select 30, not 23), so it will take them a bit to get used to it. I still can't believe that Adobe wouldn't simply export what it's been formatted to, though... after all, that's what we set the form up for.
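For comparison, the strict mmm/dd/yyyy check the form needs can be sketched with Python's strptime: "Aug/27/2013" passes, while free-form text like "August 27 2013" does not. This only illustrates the validation rule, not an Acrobat API.

```python
from datetime import datetime

def valid_mmm_dd_yyyy(text):
    # %b accepts only abbreviated month names (Jan, Feb, ...),
    # so both the separators and the month spelling are enforced.
    try:
        datetime.strptime(text, "%b/%d/%Y")
        return True
    except ValueError:
        return False
```

A parse-based check like this also rejects impossible dates such as "Aug/32/2013", which a pure character mask like "AAA/99/9999" cannot.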
-
Performance problems on bulk data importing
Hello,
We are importing 3,500,000 customers and 9,000,000 sales orders from an
external system into the CRM system initially. We developed ABAP programs
that use standard BAPI functions to import the bulk data.
We have seen that this process will take approximately 1.5 months to
finish. That is a very long time for us to wait; we want to complete
the job in about a week.
Have we done something wrong? For example, are there faster, SAP-standard
ways to import bulk partners and sales orders without developing ABAP
programs?
best regards,
Cuneyt Tektas
Hi Cuneyt,
SAP standard supports import from an external source. You can use the XIF adapter, or you can also use eCATT.
Thanks,
Vikash. -
Data export error when migrating a CUCM cluster on version 7.1.5 to CUCM 10.0
Hi
Has anyone come across below? If so any suggestions for workaround?
Oct 01, 2014 11:54 PDT
STATUS
The task has been scheduled.
Oct 01, 2014 11:54 PDT
INFO
Export task action ID #154 with 1 node(s) scheduled.
Oct 01, 2014 11:54 PDT
STATUS
The task has started.
Oct 01, 2014 11:54 PDT
INFO
Export task action ID #154 with 1 node(s) started.
Oct 01, 2014 11:54 PDT
INFO
Export job for node xx.xx.xx started.
Oct 01, 2014 12:09 PDT
ERROR
Data export failed for node xx.xx.xx.
Oct 01, 2014 12:09 PDT
ERROR
Export job for node xx.xx.xx failed.
Oct 01, 2014 12:09 PDT
ERROR
1 nodes(s) in Export task action ID #154 failed: xx.xx.xx
Oct 01, 2014 12:09 PDT
ERROR
Task paused due to task action failures.
Hi,
You can log in to PCD through PuTTY to see the logs:
file list activelog tomcat/logs/ucmap/log4j/ detail date
Further, run (for example):
file tail activelog tomcat/logs/ucmap/log4j/ucmap00001.log
regds,
aman -
Optimization for bulk data upload
Hi everyone!
I've got the following issue:
I have to do a bulk data upload using JMS, deployed in GlassFish 2.1, to process and validate data against an Oracle 10g database before it is inserted.
I have a web interface that loads a file and then delegates the processing to a stateless session bean, which reads N lines and then sends a message to a JMS queue. The MDB has to parse each line, validate it against the data already in the DB, and finally persist the new data.
This process is highly resource-intensive, and I need to improve the processing time. I tried changing the GlassFish default JMS and JDBC pool sizes, but saw no big difference.
Do you have any advice that could help me?
Thanks in advance!
Hi! Thank you for your answer!
The heavy processing is in the MDB.
I'm grouping every N read lines in the EJB and then sending the message to JMS. The MDB then parses and persists each line into the different related tables.
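The "group N lines per message" step described above is just a chunking operation; a minimal language-neutral sketch (shown here in Python) makes the tuning knob explicit: larger N means fewer, heavier messages competing for the MDB pool.

```python
def chunk(lines, n):
    """Split lines into messages of at most n lines each."""
    return [lines[i:i + n] for i in range(0, len(lines), n)]
```

In the scenario above, N should be benchmarked against the MDB pool size and per-message transaction cost rather than picked once and forgotten.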
Thanks again! -
PeopleSoft CS SAIP Announce Status Issue in Bulk Data Exchange Status
XML is generated in the provided directory path under SAIP "Web Service Targets", but "Announce Status" is blank under Bulk Data Exchange Status, and even the "Event Message Monitor" shows nothing!
We have activated all SAIP service operations and their delivered routings on our side.
The transaction status on the Bulk Data Exchange Status page says Announced, but "Announce Status" is blank on the same page.
Announce Status should have one of these possible values per PeopleBooks (Connector Error, Failure, Processing, Success, Unsupported).
What could be wrong? Please help. Thank you...
Regards,
Ashish
You are welcome. I'm glad you got it back up.
(1) You say you did the symbolic link. I will assume this is set correctly; it's very important that it is.
(2) I don't know what you mean by "Been feeding the [email protected] for several weeks now, 700 emails each day at least." After the initial training period, SpamAssassin doesn't learn from mail it has already processed correctly. At this point, you only need to teach SpamAssassin when it is wrong. [email protected] should only be getting spam that is being passed as clean. Likewise, [email protected] should only be getting legitimate mail that is being flagged as junk. You are redirecting mail to both [email protected] and [email protected] ... right? SpamAssassin needs both.
(3) Next, as I said before, you need to implement those "Frontline spam defense for Mac OS X Server." Once you have that done and issue "postfix reload" you can look at your SMTP log in Server Admin and watch as Postfix blocks one piece of junk mail after another. It's kind of cool.
(4) Add some SARE rules:
Visit http://www.rulesemporium.com/rules.htm and download the following rules:
70sareadult.cf
70saregenlsubj0.cf
70sareheader0.cf
70sarehtml0.cf
70sareobfu0.cf
70sareoem.cf
70sarespoof.cf
70sarestocks.cf
70sareunsub.cf
72sare_redirectpost
Visit http://www.rulesemporium.com/other-rules.htm and download the following rules:
backhair.cf
bogus-virus-warnings.cf
chickenpox.cf
weeds.cf
Copy these rules to /etc/mail/spamassassin/
Then stop and restart mail services.
There are other things you can do, and you'll find differing opinions about such things. In general, I think implementing the "Frontline spam defense for Mac OS X Server" and adding the SARE rules will help a lot. Good luck! -
What rights are needed to do a Data Export?
I would like someone from our help desk to be able to do Data-->Export
in ConsoleOne to export inventory data regularly. When they try this,
they immediately get the following error:
Data Export will not proceed. Unable to identify the type of
installation.
I saw TID 10088974, but I don't think it applies, because if I log in on
the same PC it works (running ConsoleOne locally).
I also saw another TID indicating the user needs Browse rights to the
ZEN_invDatabase object; [Public] does have those rights.
What rights do I need to grant?
-Marc Johnson
On Wed, 15 Jun 2005 22:22:40 GMT, Marc Johnson wrote:
> What rights do I need to grant?
they will need the read right to the properties of the database object...
Marcus Breiden
Please change -- to - to mail me.
The content of this mail is my private and personal opinion.
http://www.edu-magic.net -
How to insert bulk data into ms-access
Hi,
I am trying to insert bulk data into MS Access. I used Statement; it
works fine but does not allow single quotes. Then I tried
PreparedStatement, which allows single quotes but not bulk data. I am getting the following error:
javax.servlet.ServletException: [Microsoft][ODBC Microsoft Access Driver]String data, right truncated (null)
Please help me..
guru
Have you tried the Memo data type in Access?
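For what it's worth, parameter binding normally gives both bulk inserts and single-quote safety at once. A minimal sketch, with sqlite3 standing in for the Access ODBC connection and an invented table name:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE notes(body TEXT)")

rows = [("it's quoted",), ("plain",)]
# executemany gives the bulk behaviour; the driver escapes the
# embedded single quote itself, so no string concatenation is needed.
conn.executemany("INSERT INTO notes(body) VALUES (?)", rows)
conn.commit()
```

The "String data, right truncated" error above is usually a column-width problem rather than a quoting one, which is why the Memo suggestion is worth testing.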
-
BPC10 - Data manager package for dimension data export and import
Dear BPC Experts,
Need your help.
I am trying to set up a data manager package for the first time to export dimension master data from one application and import it in another application (both have the same properties).
I created a test data manager package from Organize > Add Package > with process chain /CPMB/EXPORT_MD_TO_FILE and Add.
In the advanced tab of each task there is some script logic already populated. Please find attached the details of the script logic written under each of the tasks, like MD_Source, Convert and Target.
I have not made any changes to the script inside the tasks.
But when I run the package, I select the dimension 'Entity', and the second prompt asks for a transformation file, and the system automatically adds the file ... \ROOT\WEBFOLDERS\COLPAL\FINANCE\DATAMANAGER\TRANSFORMATIONFILES\Import.xls
I have not changed anything there.
In the next prompt, it asks for an output file and won't allow me to enter the file name.
Not sure how to proceed further.
I shall be grateful if someone could guide me, from your experience, on how to set up a simple data manager package for master data export from a dimension. Should I update the transformation file in the script for the import file and output file in the advanced tab? How and what transformation file should be created and linked to the data manager package for export/import?
What are the steps to be executed to run the package for exporting master data from a dimension and importing it into another application?
Thanks in advance for your guidance.
Thanks and Regards,
Ramanuj
=====================================================================================================
Details of the tasks
Task: APPL_MD-SOURCE
PROMPT(DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
Task: EXPORT_MD_CONVERT
PROMPT(DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
Task: FILE_TARGET
PROMPT(DIMENSIONMEMBER, %DIMENSIONMEMBERS%, "Please select dimension", "Please select members", %DIMS%)
PROMPT(TRANSFORMATION,%TRANSFORMATION%,"Transformation file:",,,Import.xls)
PROMPT(OUTFILE,,"Please enter an output file",Data files (*.txt)|*.txt|All files(*.*)|*.*)
PROMPT(RADIOBUTTON,%ADDITIONINFO%,"Add other information(Environment,Model,User,Time)?",1,{"Yes","No"},{"1","0"})
INFO(%TEMPNO1%,%INCREASENO%)
INFO(%TEMPNO2%,%INCREASENO%)
TASK(/CPMB/APPL_MD_SOURCE,SELECTION,%DIMENSIONMEMBERS%)
TASK(/CPMB/APPL_MD_SOURCE,OUTPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,INPUTNO,%TEMPNO1%)
TASK(/CPMB/EXPORT_MD_CONVERT,TRANSFORMATIONFILEPATH,%TRANSFORMATION%)
TASK(/CPMB/EXPORT_MD_CONVERT,SUSER,%USER%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPPSET,%APPSET%)
TASK(/CPMB/EXPORT_MD_CONVERT,SAPP,%APP%)
TASK(/CPMB/EXPORT_MD_CONVERT,OUTPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,INPUTNO,%TEMPNO2%)
TASK(/CPMB/FILE_TARGET,FULLFILENAME,%FILE%)
TASK(/CPMB/FILE_TARGET,ADDITIONALINFO,%ADDITIONINFO%)
================================================================================
1. Perhaps you want to consider a system copy to a "virtual system" for UAT?
2. Changes in QAS (as with PROD as well) will give you the delta. They should ideally be clean... You need to check the source system.
Another option is to generate the profiles in the target system, but for that your config has to be squeaky clean and in sync, including very well maintained and synced SU24 data.
Cheers,
Julius -
Data Export to Oracle table from essbase issue
Hello,
I am using a data export calc script to load data from Essbase into an Oracle table. We have Essbase 11.1.2.1 on a Windows 2008 R2 64-bit server. I have an ODBC System DSN created for this job.
However, when I launch this process I get a message in the log: "Cannot read SQL driver name for [Backup Exec Catalogs] from [ODBC.INI]"
I have checked the ODBC.INI file in the C:\Windows directory, and that file contains the connection entries...
Any thoughts as to why I am getting this error on one server, whereas the same process on the SIT server works fine?
Thanks...
Please restart the application and try again, and check the application log for any details.
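For reference, this error typically appears when a DSN entry in ODBC.INI (here the stray [Backup Exec Catalogs] entry) has no Driver line. A well-formed System DSN entry looks roughly like the sketch below; the DSN name, driver path, and server name are placeholders, not values from this environment:

```ini
; Hypothetical entry -- names and paths are placeholders
[ODBC Data Sources]
EssbaseExportDSN=Oracle ODBC Driver

[EssbaseExportDSN]
Driver=C:\oracle\product\11.2.0\bin\sqora32.dll
ServerName=ORCL
```

Comparing the working SIT server's ODBC.INI against the failing one, entry by entry, is the quickest way to spot the difference.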
-
Hi all,
I'm working at a client where we upgraded from 4.6 to 6.0. This client has never used the master data export program.
I'm trying to configure the master data IDocs, but I'm getting an error when running the program RPCEMDU0_CALL. It refers to an OSS note back in 2001, so that really does not apply to this installation. The problem seems to be with function module HR_PU12_UPDATE_T532K_T532L on IT0106: some sort of data dictionary discrepancy. I haven't been successful in finding any newer notes related to this.
Has anyone experienced this problem before?
Thanks in advance for your response.
Cesar
Hi Bernd,
Thanks for your response. I looked at that OSS note, but it doesn't seem to be related to the problem I'm having.
The error reads:
"Export program terminated A009
Export program obsolete, regeneration required
Do not regenerate immediately: First make a $STRUC_P0106 back up of the include
PERNR N 000008 000000
(See note 375108) This include is very important if old cluster IF records"
I've re-activated the program and interface formats, but still getting the same error.
Thanks again,
Cesar