Time required to create FPGA
Eric Pesek and I came across many slow sections while working through the example in the "Getting Started with the NI PCI-5640R IF Transceiver and the LabVIEW FPGA Module" guide. We began on page 11 of the guide with Creating an FPGA Application, and to summarize the results I put all the slow sections in a spreadsheet. It notes the page, section, and subsection where each delay occurred; the last two columns give an estimate of how long the task took and any additional comments. Can anyone explain why these tasks took so long to complete? If not, is there anything we can do to correct these delays?
Please see the attachment to view the elapsed time of each section.
Thank you,
Chad Stone
Eric Pesek
Attachments:
NI Project Time1.xls 21 KB
Hi Chad, Eric
It's hard to tell without knowing your system/computer specs. On my computer it doesn't take that long (what you say takes 4-5 minutes takes me 1-2). My first guess is that you are using a system with the minimum requirements. What is the speed of your computer, and how much memory does it have? Also, are you using the released versions of the LV FPGA, NI-RIO, and NI 5640R programming VIs?
Jerry
Similar Messages
-
Time Estimate for creating a new Discoverer Report
Hi,
Can anyone tell me how to estimate the time required to create a new Discoverer Report?
It would be great if anyone could give me a template / tool for calculating the time estimate. I know it's a manager's job, but I am supposed to give this time estimate for a report.
The requirement is to convert the standard Apps Invoice Aging Report into a Discoverer Report. As far as I know, this is a fairly complex report in Apps.
Thanks!
Yogini
Hi Yogini,
I’ve just looked at the ‘Invoice Aging Report’, and it shouldn’t be too difficult to convert into a Discoverer Report.
The time scales really depend on the following:
-whether the Discoverer Report has been fully spec’d out by the customer.
Sometimes it’s not a matter of just copying a standard Apps report into Discoverer, as they can always be improved on. It’s best to get the customer to specify exactly what they want on paper first, such as any additional columns, sorts, what parameters etc. Creating a ‘Discoverer Business Requirement Form’ could come in handy, to capture all of this information from the customer, hopefully avoiding them giving you the run around.
-if the EUL has all the required items for the report
An item required in the report might not exist in the EUL. If this is the case, do you need to go through Change Control to get it put in? How long would this take?
-how quickly you can get the report into a TEST instance for the customer to test before signing off the change to go into LIVE. Sometimes customers can hold this process up, so give yourself a couple of days.
So the time you give really depends on how comfortable you feel with the module (AP) and the actual creation of Discoverer Reports. It obviously also depends on your business processes which can always slow things down.
As for the report, I’ve actually made a report very similar for the Transactions Age Debt (AR). It could help you a bit.
To calculate the Days Overdue you could use a Calculation below:
SYSDATE-Payment Schedules.Due Date
To calculate the 1 - 30 Days you could use a Calculation with a case statement below.
CASE WHEN Days Overdue BETWEEN 0 AND 30 THEN Balance Due ELSE TO_NUMBER(NULL) END
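The two calculations above can be sketched generically. The following Python sketch illustrates the same days-overdue and 1-30-day bucket logic (the function names are hypothetical and this is not Discoverer syntax, just the underlying arithmetic):

```python
from datetime import date

def days_overdue(due_date, today=None):
    """Equivalent of SYSDATE - Due Date: days past due (negative = not yet due)."""
    today = today or date.today()
    return (today - due_date).days

def bucket_1_30(days, balance_due):
    """Equivalent of the CASE calculation: the balance falls in the 1-30 day
    bucket only when 0 <= days overdue <= 30, otherwise NULL (None here)."""
    return balance_due if 0 <= days <= 30 else None

# Example: an invoice due 10 days ago with a 500.00 balance
overdue = days_overdue(date(2024, 1, 1), today=date(2024, 1, 11))
print(overdue)                      # 10
print(bucket_1_30(overdue, 500.0))  # 500.0 -> lands in the 1-30 bucket
print(bucket_1_30(45, 500.0))       # None  -> belongs in a later bucket
```

You would repeat the same CASE pattern with ranges 31-60, 61-90, etc. for the remaining aging buckets.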
Otherwise I have no Time Estimate Calculation form… but I hope the info will help.
Cheers,
Lance -
Reducing time required for ABAP-only copyback (system copy) process
Our company is investigating how to reduce the amount of time it takes to perform a copyback (system copy) from a production ABAP system to a QA system. We use a similar process for all ABAP-only systems in our landscape, ranging from 3.1h systems to ECC6.0 ABAP-only systems on both DB2 and Oracle database platforms, and the process takes approximately two weeks of effort from end-to-end (this includes time required to resolve any issues encountered).
Here is an overview of the process we use:
• Create and release backup transports of key system tables and IDs (via client copy) in the QA system to be overwritten (including RFC-related tables, partner profile and IDOC setup-related tables, scheduled background jobs, archiving configuration, etc.).
• Reconfigure the landscape transport route to remove the QA system from the transport landscape.
• Create a virtual import queue attached to the development system to capture all transports released from development during the QA downtime.
• Take a backup of the target production database.
• Overwrite the QA destination database with the production copy.
• Localize the database (performed by DBAs).
• Overview of Basis tasks (for smaller systems, this process can be completed in one or two days, but for larger systems, it takes closer to 5 days because of the BDLS runtime and the time it takes to import larger transport requests and the user ID client copy transports):
o Import the SAP license.
o Execute SICK to check the system.
o Execute BDLS to localize the system.
o Clear out performance statistics and scheduled background jobs.
o Import the backup transports.
o Import the QA client copy of user IDs.
o Import/reschedule background jobs.
o Perform any system-specific localization (example: for a CRM system with TREX, delete the old indexes).
• Restore the previous transport route to include the QA system back into the landscape.
• Import all transports released from the development system during the QA system downtime.
Our company's procedure is similar to the procedure demonstrated in this 2010 TechEd session:
http://www.sapteched.com/10/usa/edu_sessions/session.htm?id=825
Does anyone have experience with a more efficient process that minimizes the downtime of the QA system?
Also, has anyone had a positive experience with the system copy automation tools offered by various companies (e.g., UC4, Tidal)?
Thank you,
Matt
Hi,
> One system that immediately comes to mind has a database size of 2TB. While we have reduced the copyback time for this system by running multiple BDLS sessions in parallel, that process still takes a long time to complete. Also, for the same system, importing the client copy transports of user ID's takes about 8 hours (one full workday) to complete.
>
For BDLS run, I agree with Olivier.
> The 2 weeks time also factors in time to resolve any issues that are encountered, such as issues with the database restore/localization process or issues resulting from human error. An example of human error could be forgetting to request temporary ID's to be created in the production system for use in the QA system after it has been initially restored (our standard production Basis role does not contain all authorizations required for the QA localization effort).
>
For the issues you encounter because of the system copy, you can minimize this time, as you will be doing it on a periodic basis: build a task list and note the issues you faced in previous runs. So normally I don't count it as system copy time.
Thanks
Sunny -
hi,
1) I want to know the time required to take a cold backup of a 20GB database. What criteria does it depend on?
2) If I export the same 20GB DB to a dump file, by what percentage is the data reduced?
Thanks in advance...
By
Mahi
B'lore
It's really hard to guess without knowing your hardware. I have a DB on an AMD Athlon 64-bit with SATA hard drives; the size of all data files is 47G, the actual data is about 39G, and if I export this DB, the size of the dump is 20G.
I do a periodic FULL export of this DB to a remote machine regularly, and it takes about 3 hours to create the dump on that network machine with a P4 processor and SATA hard drives.
So it depends on your I/O bandwidth and the specification of your other hardware. I think this information could help you estimate the size of your dump file for a 20G database and the time it may take. (My dump time includes network overhead, which could be avoided if the dump is done on a local machine.)
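To make that estimate concrete, here is a rough back-of-the-envelope sketch in Python; the throughput figure is derived from the numbers in the reply above and is an assumption you would replace with your own measured I/O or network rate:

```python
def export_time_hours(db_size_gb, throughput_mb_per_s):
    """Rough estimate: time to write a dump of db_size_gb GB at a sustained MB/s rate."""
    size_mb = db_size_gb * 1024
    return size_mb / throughput_mb_per_s / 3600

# The poster's numbers: a 20 GB dump written over the network in ~3 hours
# implies a sustained rate of roughly 20*1024 / (3*3600) ≈ 1.9 MB/s.
implied_rate = 20 * 1024 / (3 * 3600)
print(round(implied_rate, 1))  # ≈ 1.9 MB/s

# Applying the same rate back to a 20 GB dump recovers the ~3 hour figure:
print(round(export_time_hours(20, implied_rate), 1))  # ≈ 3.0 hours
```

A cold backup estimate works the same way, except the size to divide is the full data file size rather than the (usually smaller) export dump.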
Regards -
12c : privileges required to create a pdb
I am trying to create a PDB as a non-SYS user
CDB1>sho con_name
CON_NAME
CDB$ROOT
CDB1>select con_id, name, open_mode from v$pdbs;
CON_ID NAME OPEN_MODE
2 PDB$SEED READ ONLY
4 PDB3_COPY READ ONLY
CDB1>create user c##sys identified by oracle;
User created.
CDB1>grant connect, resource to c##sys;
Grant succeeded.
CDB1>grant create pluggable database to c##sys;
CDB1>grant dba to c##sys;
CDB1>conn c##sys/oracle@cdb1
Connected.
CDB1>sho user
USER is "C##SYS"
CDB1>create pluggable database pdb1 from pdb3_copy;
create pluggable database pdb1 from pdb3_copy
ERROR at line 1:
ORA-01031: insufficient privileges
What all minimum privileges are required to create a PDB?
Regards
You can connect as SYSDBA to make it work but based on what you posted you haven't granted SYSDBA to user C##SYS.
If you run your query now as user c##sys:
>
select con_id, name, open_mode from v$pdbs;
>
you should find that it returns no rows. That is also why your CREATE statement fails. Grant SYSDBA to the user and then run that query and you should now see the same PDBs you saw when you connected as SYS.
Also, be careful when you make grants. By default the grant ONLY applies to the current container.
So your 'grant connect to c##sys' only allows that user to connect to the ROOT. The 12c GRANT statement uses a CONTAINER clause whose default is CURRENT. If you want c##sys to be able to connect to the new PDB, either use CONTAINER=ALL or switch to the new PDB and, as SYS, do the grant again for that specific PDB.
You don't need to grant the DBA role to make this work by the way.
May as well mention also that if you use a tool like Toad that isn't 12c aware you are going to see a mess when you look at things like grants since there may appear to be multiple grants of exactly the same type. For instance your CONNECT grant could appear 5 times if the user has the grant for 5 different containers. An older tool won't show you which container the grants are for since they won't be 'container aware'.
See CREATE PLUGGABLE DATABASE in the SQL Reference doc
http://docs.oracle.com/cd/E16655_01/server.121/e17209/statements_6009.htm
That doc lists the prerequisites
>
Prerequisites
You must be connected to a CDB and the current container must be the root.
You must have the CREATE PLUGGABLE DATABASE system privilege.
The CDB in which the PDB is being created must be in READ WRITE mode.
To specify the create_pdb_clone clause:
If src_pdb_name refers to a PDB in the same CDB, then you must have the CREATE PLUGGABLE DATABASE system privilege in the root of the CDB in which the new PDB will be created and in the PDB being cloned.
If src_pdb_name refers to a PDB in a remote database, then you must have the CREATE PLUGGABLE DATABASE system privilege in the root of the CDB in which the new PDB will be created and the remote user must have the CREATE PLUGGABLE DATABASE system privilege in the PDB to which src_pdb_name refers.
>
Did you notice that part about having CREATE PLUGGABLE DATABASE in 'the PDB being cloned'? You haven't done that.
The doc also has an example of cloning:
>
Cloning a PDB From an Existing PDB: Example The following statement creates a PDB newpdb by cloning PDB salespdb. PDBs salespdb and newpdb are in the same CDB. Because no storage limits are explicitly specified, there is no limit on the amount of storage for newpdb. The files are copied from /disk1/oracle/salespdb/ to /disk2/oracle/newpdb/. The location of all directory object paths and paths contained in certain parameters associated with newpdb are restricted to the directory /disk2/oracle/newpdb/.
CREATE PLUGGABLE DATABASE newpdb FROM salespdb FILE_NAME_CONVERT = ('/disk1/oracle/dbs/salespdb/', '/disk1/oracle/dbs/newpdb/') PATH_PREFIX = '/disk1/oracle/dbs/newpdb';
>
Note in particular the use of the FILE_NAME_CONVERT clause to tell Oracle how to name and where to put the new files.
The common reasons for failure in your use case are:
1. the CDB is not open in READ WRITE
2. the CREATE PLUGGABLE DATABASE privilege was not granted in the PDB being cloned, or the granting role is not a DEFAULT role for your user.
3. the PDB you are cloning has not been opened READ ONLY
4. the user is not connected as SYSDBA (only needed if the grants are not done individually and manually)
NOTES: you opened the source DB in READ ONLY mode BEFORE you created the common user c##sys and granted the privileges. That prevented Oracle from creating that user in the source DB. When you create a common user Oracle will also create that user automatically in any PDB that is already opened READ WRITE.
The same goes for the grants when you use CONTAINER = ALL; that only creates the grants if the PDB is writeable. You need to be VERY CAREFUL when you create common users and issue grants to them to make sure you know which containers (PDBs) are open, or you may have similar issues in the future. The only way to correct those problems later is to reissue the grants from the ROOT after the previously closed PDBs are open, or to connect to the PDB and manually create the user and grants.
Also your c##sys user won't be able to drop that PDB because you didn't grant that privilege. And if you do drop the PDB and use INCLUDING DATAFILES the directory itself won't be deleted.
You can get a mess on your hands real quick if you don't create an architecture document ahead of time that identifies what the structure should be for the different containers.
And just a reminder - we can't tell you HOW things are supposed to work in 12c. At best we can try to tell you how they DO work. So I have NO IDEA if Oracle intended PDB creation to require being connected as SYSDBA rather than just having the DBA role but that is how it works for me if you don't grant CREATE PLUGGABLE DATABASE in the PDB being cloned. -
SharePoint Designer 2013 -- Workflow Manager is required to create Workflow
Hi,
Is Workflow Manager required to create a List Workflow in SharePoint Designer 2013? Is there any other alternative for this?
With Regards,
Jaskaran Singh Bhatti
SharePoint 2013 supports both 2010 legacy workflows, which run in the OWS timer service on the SharePoint server, and 2013 workflows, which run on a server running Workflow Manager. You can still create the same kind of workflows you created in SharePoint 2010 without Workflow Manager, including 2010 List Workflows. You only need Workflow Manager if you want to create the newer kind of workflows that use it. All the built-in workflows still use the 2010 legacy workflow system. Both types of workflows can be created using SharePoint Designer 2013.
Paul Stork SharePoint Server MVP
Principal Architect: Blue Chip Consulting Group
Blog: http://dontpapanic.com/blog
Twitter: Follow @pstork
Please remember to mark your question as "answered" if this solves your problem. -
To find the time required by the process chain to complete
Hi Experts,
I am calculating the average time required by a process chain to complete.
Is there any way to find the time required by the process chain to complete..
Thanks in advance.
Regards,
Ashwin
Hi,
There is a tool provided by SAP to do Process Chain Analysis.
It is basically an ABAP program, /SSA/BWT, which provides the following BW tools:
a) Process Chain Analysis: this tool is used to perform runtime analysis of process chains. The analysis can be performed not only at the process chain level but also at the process type level.
b) Detailed Request Analysis
c) Aggregate Toolset
d) InfoProvider BPPO Analysis
So you can go through the program and analyse the runtime of your Process Chains.
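The averaging itself is simple once you have a start and end timestamp for each run, whether extracted via the tool above or from a chain log. A minimal Python sketch with hypothetical log data:

```python
from datetime import datetime

# Hypothetical (start, end) timestamp pairs, as you might extract them
# from a process chain log for the last few runs.
runs = [
    ("2008-08-11 02:00:00", "2008-08-11 02:45:00"),
    ("2008-08-12 02:00:00", "2008-08-12 03:15:00"),
    ("2008-08-13 02:00:00", "2008-08-13 02:30:00"),
]

fmt = "%Y-%m-%d %H:%M:%S"
# Runtime of each run, in minutes
durations = [
    (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 60
    for start, end in runs
]
avg_minutes = sum(durations) / len(durations)
print(round(avg_minutes, 1))  # 50.0 minutes for the sample data above
```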
Regards,
Abhishek
Edited by: Abhishek Dutta on Aug 13, 2008 7:13 AM -
Why does a delivery sometimes get created through VL10C even when sales order stock is not available?
Hi,
Why does a delivery sometimes get created through VL10C in batch mode even when sales order stock is not available?
Thanks,
Kalyan.
The correct English name is: replenishment lead time.
Check with Replenishment lead time - Supply Chain Management (SCM) - SCN Wiki -
How to audit a user at same time it is created?
Hi, I got a problem and I hope someone can help me.
Is there any way of auditing a user at same time it is created?
For example, I create the user "Eddy" and I want this account to be automatically audited, so I don't have to execute "audit session username;" each time a new user is created.
I wasn't aware of it, but it seems that most DDL operations are not available directly from system triggers. Anyway, you can log the created user in a table (the name is available in ora_dict_obj_name) to be processed later by a scheduled job.
All in all, it seems much easier to use the two statements: create + audit. -
How to make metadata fields required when creating folders
Hello all....
Related issue with SR 3-6472229431 and SR 3-6471130611.
We're using DIS 11.1.6 64-bits (2011_11_29 (rev 9756) 11.1.6.97) in a Windows 7 64-bits workstation (with UCM 11.1.1.5 in a Linux machine). The check-in of images to UCM goes fine.
I'm trying to make some metadata fields required when creating a folder. These fields are required when making a check-in, but not when creating folders.
Folders_g is enabled. DesktopTag too.
EDIT: Patch: 14695303 - WCC 11.1.1.5.0 BUNDLE ( MLR 16 ) NOV 6 2012 applied.
Is it possible?
Thanks for all.
Edited by: fgomes on 22/11/2012 03:24
After reading your response and rereading the original question a bit closer, the metadata prompting feature does not apply to creating new folders, only content.
Again, though, I think the focus here is in the wrong place. The metadata applied to a folder is intended to be ultimately applied to the content. You can build global rules that fire on submission of content to check if a field has a value, and throw an error if the value is empty.
If you expect users to create folders (and actually apply any metadata to the actual folder itself), you will be disappointed. Experience shows that users are not interested in that level of detail when creating content, let alone folders. Letting typical users create folders is a bad idea anyway, as they tend to create the same inefficient folder structures they previously created in file shares within Content Server.
If you need to tightly control folder attributes, you'll be better served by locking down the ability to create new folders. Otherwise you're looking at some type of customization. Keep in mind that you won't be able to customize the right click behavior of DIS. Any changes to DIS would have to be an enhancement request. -
Can't adjust the default time scale when creating a new appointment in Outlook 2010
Good afternoon. I have found a question on these community boards that describes an issue I am currently experiencing. The answer, however, was not the solution for me. I used the same name as the original post and will add the original text, as it is exactly what I am experiencing.
"When creating a new appointment in the month view Outlook 2010, the default time scale or duration is 2 days. I tried resetting the view, adjusting the time scale, and creating a new profile. It only seems to happen when creating the appointment
in the Month view. ??? Day and week view work fine and appointment can be created w the standard 30 minute time scale. But the time scale in the month view although says 30 mins, it creates appointments in 2 day durations."
This is exactly what happens to the user I am assisting. When in "month view", double clicking the day will bring up a new appointment with the times within the same day. If you click the "new appointment" button in the ribbon, it sets the end time to the following day. This only happens in the month view; every other view works as it normally should. I have tried it several times, being sure multiple days are not selected, and it still gives me an end time of the following day. I have reset the view and changed the time scale, but nothing seems to bring it back to the default same-day appointment. Any ideas?
Thanks for any assistance you can provide.
It is a default in Outlook 2010 in month view. The "time scale" selection is greyed out. Very frustrating. Every time I select "New Appointment," a duration of 1 day is automatically selected: midnight to midnight. I end up doing most appointments twice because I forget to change the end date back to the day of the appointment. "New Meeting Request" works fine. So does "New All Day Event" when you deselect "All Day Event." I can't see how this default behavior provides optimal results under any circumstances. Very frustrating. -
Does one need to disable the Wi-Fi on one's router if using a Time Capsule to create a new network? What happens if you don't, and what are the benefits of doing so?
No, but you may experience a bag of hurt from cross-channel interference. You may have to reset the Subnet Mask of one of the routers along with separating the broadcast channels.
Further, if you then have one device (a PC, perhaps) on one network and your Mac on another, you may not be able to share files/iTunes etc. with any ease.
So switching off the Wi-Fi on the router/modem, running an Ethernet cable to the Time Capsule, and setting that up as your Wi-Fi router distributing the IP addresses over DHCP really is the least problematic approach.
If you have cable Internet with a decoder with Ethernet output, ditch your old router completely! -
I have realised over time that I have created multiple accounts. I have bought music on my iPhone and am not able to locate the account on which I made most of my purchases. I have bank statements for the purchases but don't know how to contact someone at iTunes to help me. PLEASE
I don't have the app and have no experience with it, but based on the app description it appears you may need it installed on your Mac as well to download the files.
You might find help on the vendor's website: http://www.nfinityinc.com/index.html -
I backed up my iPhone 4S to iCloud on Jan 19. I am now trying to do another backup, but it says the time required is 7 hours. That seems too long for 1 GB of data stored in iCloud. Can someone help me please?
To be honest, that sounds about right.
For example, on my 8 Mbps (megabit) down service I get around 0.4 Mbps upload. That is the equivalent of (very approximately) 3 MB (megabytes) per minute, or 180 MB per hour. Over 7 hours that would be just over 1 GB.
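That back-of-the-envelope arithmetic can be sketched as follows; the 0.4 Mbps figure is the example upload rate from this reply, not a general value:

```python
def upload_hours(data_mb, uplink_mbps):
    """Hours to upload data_mb megabytes over an uplink of uplink_mbps megabits/s."""
    mb_per_hour = uplink_mbps / 8 * 3600  # megabits/s -> megabytes/hour
    return data_mb / mb_per_hour

# 1 GB (~1024 MB) over a 0.4 Mbps uplink (i.e. ~180 MB/hour):
print(round(upload_hours(1024, 0.4), 1))  # ≈ 5.7 hours, in the ballpark of 7
```

The gap between ~5.7 and 7 hours is easily explained by protocol overhead and a less-than-sustained upload rate.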
Obviously, it all depends on your connection speed, but that is certainly what I would expect, and that is why I use my computer for backing up, not iCloud. So much quicker. -
Run time error while Creating IDN and Inbound Delivery
Hi All,
I am getting a run time error while creating an Inbound Delivery Notification from the Inbound Queue and an Inbound Delivery from the IDN.
After debugging, I found that the system is looking for patch level SAPKNA7026, but my system's patch level is only SAPKNA7022.
Are there any OSS notes available to deactivate/resolve this run time error? I tried note 0001798794 (Deactivation of hierarchical access functionality) but the system still gives me a run time error.
Regards,
P@M
Hi Agrawal,
As you are getting the run-time error after entering the sold-to party and material, and the error is in SAPLV61Z, close the session for some time, open it again, create the sales order, enter the sold-to party and material, and check whether you still get the error. If you are still facing the problem, take the help of the ABAP and Basis consultants.
Regards
Srinath