Usage of Temp tables in SSIS 2012
Hello,
We have many SSIS 2008 R2 packages that import data into a temp table and process it from there.
We are upgrading to SQL Server 2012 and our SSIS packages fail there because they use temp tables as working tables. While investigating, we found that SQL Server 2012 deprecates the SET FMTONLY option and instead uses
sp_describe_first_result_set, which does not support temp tables as import tables. SSIS works fine on our workstations but not on the DEV box. With SQL 2012, I can execute the packages from my workstation, which has version 11.0.2100.60, whereas the DEV server
has SQL Server version 11.0.3000.0.
Running Profiler against both machines captures two different statements.
From the workstation (11.0.2100.60):
CREATE TABLE #temp (
Id varchar(255) NULL,
Name varchar(255) NULL )
go
declare @p1 int
set @p1=NULL
declare @p3 int
set @p3=229378
declare @p4 int
set @p4=294916
declare @p5 int
set @p5=NULL
exec sp_cursoropen @p1 output,N'select * from #temp',@p3 output,@p4 output,@p5 output
select @p1, @p3, @p4, @p5
go
this works fine.
But on the DEV server (version 11.0.3000.0), the SQL below executes and fails to get the metadata:
CREATE TABLE #temp (
Id varchar(255) NULL,
Name varchar(255) NULL )
exec [sys].sp_describe_first_result_set N'select * from [dbo].[#temp]'
On checking the assembly differences between the two versions, the only one I could see was Microsoft.SqlServer.ManagedDTS.dll at 11.0.3000.0, which I replaced with the 11.0.2100.60 version, but I still get the same result.
The other difference I found is in the .NET Framework libraries.
Could you advise which assembly causes this difference in behavior between our workstation and the DEV server, i.e. 11.0.2100.60 vs. 11.0.3000.0?
Many thanks
The scripts above are taken from Profiler.
The error message is:
The metadata could not be determined because statement 'Select * from #branchscan' uses a temp table.
I have seen workarounds suggesting table variables or global temp tables, but we have 100+ SSIS packages that load data from a flat file into a temp table and then run a stored procedure to process the data from that temp table. The
error is thrown during the pre-execute phase of the OLE DB Destination, when it tries to get the metadata of the table.
At this stage it would be difficult for us to change the logic to global temp tables or table variables.
Thanks
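The failure and the 2012-era mitigation can be seen directly in T-SQL. WITH RESULT SETS (new in SQL Server 2012) lets a batch declare its own result shape, which is the usual fix for source-side queries; for the OLE DB Destination case described above, the common mitigation is instead ValidateExternalMetadata = False together with RetainSameConnection = True. This is a sketch, with column names mirroring the Profiler capture:

```sql
-- A local temp table reproduces the 2012 metadata failure
CREATE TABLE #temp (Id varchar(255) NULL, Name varchar(255) NULL);

-- Fails on 2012: "The metadata could not be determined because
-- statement 'select * from #temp' ... uses a temp table."
EXEC sys.sp_describe_first_result_set N'select * from #temp';

-- Workaround for source queries: declare the result shape explicitly,
-- so the provider never has to sniff it from the temp table.
EXEC sys.sp_executesql N'select Id, Name from #temp'
WITH RESULT SETS ((Id varchar(255) NULL, Name varchar(255) NULL));
```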
Similar Messages
-
Issue with Temp tables in SSIS 2012 with RetainSameConnection=true
Hello,
We have a few packages written in 2008 that are being upgraded to 2012. Our packages mostly use temp tables during processing. During the initial migration we faced an issue handling temp tables in the OLE DB Destination provider, and found a solution for it under
usage of Temp tables in SSIS 2012
Most of our packages execute fine now.
We recently came across a different issue. One of our packages, which merges 3 feeds into a temp table and then executes a stored procedure for processing, fails intermittently.
Below are the properties of the SSIS components that may be relevant:
* Retainsameconnection for the OLE Db connection manager set to True
* properties of OLEDB Destination
AccessMode : SQL Command
CommandTimeOut : 0
SQLCommand : Select * from #tmp
* using SSIS 2012 and SQL OLEDB Native Provider 11 (Provider=SQLNCLI11.1)
* one of the feeds is 10 MB
During investigation using Profiler, I found that although I use RetainSameConnection, more than one SPID is often used within the scope of the SSIS execution, and whenever this happens the package fails with the error message below:
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E14 Description: "Statement(s) could not be prepared.".
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80040E14 Description: "Invalid object name '#tmp'."
Now, why does SSIS use a different SPID during execution when RetainSameConnection is set to True? (Note: I have only one OLE DB connection in that package.)
To simulate the issue, I used a 500 KB file instead of the 10 MB file, executed the package twice, and everything went fine.
Is the 10 MB file taking long enough to process that the OLE DB Destination times out, forcing SSIS to open another connection? But remember, CommandTimeout is set to infinite (0) for that OLE DB Destination.
Your response is much appreciated.
Hey,
I understand you set the RetainSameConnection property to True for all the OLE DB connections used in the package; if not, make sure it is set for all the connections, including the file connections.
Additionally, try setting the DelayValidation property to True for all the data flows and control flow tasks, and run the package with the 10 MB file.
I hope this fixes the intermittent failure you are facing in SSIS.
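The failure mode above can be reproduced directly in T-SQL: a local temp table is visible only to the session (SPID) that created it, so any statement SSIS issues over a second connection cannot see it. A minimal sketch, assuming two separate sessions:

```sql
-- Session / SPID 1: create and load the temp table
CREATE TABLE #tmp (Id int NULL);
INSERT INTO #tmp (Id) VALUES (1);
SELECT * FROM #tmp;   -- works on this SPID

-- Session / SPID 2 (the extra connection SSIS opens when
-- RetainSameConnection does not hold):
SELECT * FROM #tmp;   -- fails: Invalid object name '#tmp'
```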
(Please mark solved if I've answered your question, vote for it as helpful to help other user's find a solution quicker)
Thanks,
Atul Gaikwad. -
I want to store flat file data in a temporary table so that I can join it with other tables.
Hello,
Please follow the steps in the article below:
Using Temp Tables in SSIS
Ahsan Kabir Please remember to click Mark as Answer and Vote as Helpful on posts that help you. This can be beneficial to other community members reading the thread. http://www.aktechforum.blogspot.com/ -
How to split list of columns into 2 tables in SSIS 2012?
Hi,
I have 200 columns in the source. Now I want to split these columns: some into Destination A and the rest into Destination B. I tried Multicast, but it copies all of the columns. Any help would be appreciated. Thanks in advance.
Let's assume I have columns A, B, C, D, E.
I want to move columns A, B, D into Destination A and columns A, C, D, E into Destination B. Please help me implement this logic.
Hi vasu_479,
Based on your description, you want to split columns in source table into two destination tables.
After testing the scenario in my environment, we can use Multicast to achieve your requirement. Just as you said, Multicast returns all columns, but we can use the methods below to achieve the goal:
If the destination tables already exist, we can simply map columns A, B and D as Input Columns to the corresponding Destination Columns in the Mappings tab for destination A, then map columns A, C, D and E for destination B.
If the destination tables are created in the Destination component, we can modify the CREATE TABLE query to create only A, B and D for destination table A, and only A, C, D and E for destination table B. These columns will then be automatically mapped
in the Mappings pane.
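A sketch of the trimmed CREATE TABLE statements from the second option (table names and column types here are placeholders; substitute your real ones):

```sql
-- Destination A receives only columns A, B and D
CREATE TABLE dbo.DestinationA (A int NULL, B int NULL, D int NULL);

-- Destination B receives columns A, C, D and E
CREATE TABLE dbo.DestinationB (A int NULL, C int NULL, D int NULL, E int NULL);
```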
If there are any other questions, please feel free to ask.
Thanks,
Katherine Xiong
Katherine Xiong
TechNet Community Support -
Global Temp Table Not found - SSIS
I am facing the below error while using a global temp table in SSIS.
[OLE DB Destination [78]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E37.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80040E37 Description: "Table/view either does not exist or contains errors.".
[OLE DB Destination [78]] Error: Failed to open a fastload rowset for " ##AGENTDTLS". Check that the object exists in the database.
[SSIS.Pipeline] Error: component "OLE DB Destination" (78) failed the pre-execute phase and returned error code 0xC0202040.
1) For the data connection manager, RetainSameConnection is set to True.
2) Data Flow Task: DelayValidation is set to True.
3) Destination component (using the temp table): ValidateExternalMetadata is set to False.
4) I am using just one data connection.
5) Before using the temp table, I check whether it exists and, if so, drop it first and re-create it.
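The existence check in step 5 is typically written like this (table name taken from the error message above; the OBJECT_ID lookup must go through tempdb, and the column list here is purely illustrative):

```sql
-- Drop the global temp table if a previous run left it behind, then re-create it
IF OBJECT_ID('tempdb..##AGENTDTLS') IS NOT NULL
    DROP TABLE ##AGENTDTLS;

CREATE TABLE ##AGENTDTLS (AgentId int NULL);  -- illustrative column list
```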
I am not able to understand the reason for the failure.
Why don't you use a permanent table in tempdb?
Kalman Toth Database & OLAP Architect
SQL Server 2014 Design & Programming
New Book / Kindle: Exam 70-461 Bootcamp: Querying Microsoft SQL Server 2012 -
Best options to use in Temp Table
Hello,
I was just trying to figure out the best option to choose when we come across a scenario where we need a temp table, a table variable, or a temp table created on the fly.
However, I could not see any big difference between those options. As per my understanding, a table variable is more convenient if the query logic is small and the result set is comparatively small; creating a temp table is also an easy option, but it takes more time. I am working on a query optimization task where plenty of temp tables are used and the query takes more than five minutes to execute. We created a few indexes in a few of the tables and reduced the query execution time to about 2 minutes. Can anyone give me more suggestions?
I have gone through various articles and learned that there is no single solution for this. I am aware of the basic criteria: use SET NOCOUNT ON, order the table in which the indexes are created, do not use SELECT * (use only the columns that are really required), create indexes, and so on. Other than these I am stuck with the usage of temp tables. There are some cases where I could convert all the temp-table logic to CTEs (I am not saying it's not possible; I really don't have time to spend on the conversion). Any suggestions are
welcome.
Actual query:
SELECT Code, dbo.GetTranslatedText(Name, 'en-US') AS Name
FROM ProductionResponse.ProductionResponse
-- 00.00.02, 5225 rows
With a table variable:
DECLARE @General TABLE (Code NVARCHAR(MAX), Name NVARCHAR(MAX))
INSERT INTO @General
SELECT Code, dbo.GetTranslatedText(Name, 'en-US') AS Name
FROM ProductionResponse.ProductionResponse
SELECT * FROM @General
-- 00.00.03, 5225 rows
With an identity column:
DECLARE @General TABLE (Id INT IDENTITY(1,1), Code NVARCHAR(MAX), Name NVARCHAR(MAX))
INSERT INTO @General (Code, Name)
SELECT Code, dbo.GetTranslatedText(Name, 'en-US') AS Name
FROM ProductionResponse.ProductionResponse
SELECT * FROM @General
-- 00.00.04, 5225 rows
With a temp table:
CREATE TABLE #General (Id INT IDENTITY(1,1) PRIMARY KEY, Code NVARCHAR(MAX), Name NVARCHAR(MAX))
INSERT INTO #General (Code, Name)
SELECT Code, dbo.GetTranslatedText(Name, 'en-US') AS Name
FROM ProductionResponse.ProductionResponse
SELECT * FROM #General
DROP TABLE #General
-- 00.00.04, 5225 rows
With a temp table on the fly:
SELECT G.Code, G.Name
INTO #General
FROM (
    SELECT Code, dbo.GetTranslatedText(Name, 'en-US') AS Name
    FROM ProductionResponse.ProductionResponse
) G
SELECT * FROM #General
-- 00.00.04, 5225 rows
>> I was just trying to figure out the best option to choose when we come across a scenario where we need a temp table, a table variable, or a temp table created on the fly. <<
Actually, we want to avoid all of those things in a declarative/functional language. The goal is to write the solution in a single statement. What you are doing is mimicking a scratch tape in a 1950's tape file system.
Another non-declarative technique is to use UDFs, which mimic 1950s procedural code or OO-style methods. Your sample code is full of COBOL-isms! In an RDBMS we follow ISO-11179 rules, so we have “<something in particular>_code” rather than just “code”, like
a field within a COBOL record. The hierarchical record structure provides context, but in an RDBMS data elements are global; or better, they are universal names.
>> I am aware of the basic criteria like use SET NO COUNT ON, Order the table in which the indexes are created, Do not use SELECT * instead use only columns which are really required, CREATE INDEXes and all.<<
All good, but you missed others. Never use the same name for a data element (a scalar) and a table (a set). Think about what “ProductionResponse.production_response” means. A set with one element is a bit weird, but that is what you said. Also, what
is this response? A code? A count? It lacks what we call an attribute property.
This was one of the flaws we inherited when ANSI standardized SQL and we should have fixed it. Oh well, too late now.
Never use NVARCHAR(MAX). Why do you need to put all of the Soto Zen sutras in Chinese Unicode? When you use over-sized data elements, you eventually get garbage data.
>> Other than these I am stuck with the usage of temp tables. There are some limitations where I can convert all the Temp table logic to CTE (I am not saying its not possible, I really do not have time to spend for the conversion). Any suggestions are
welcome.<<
Yes! This is how we do declarative/functional programming! Make the effort, so the optimizer can work, so you can use parallelism and so you can port your code out of T-SQL dialect.
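As an illustration of the single-statement style being recommended here, the staged temp-table pattern from the question can collapse into one query (a sketch reusing the question's illustrative table and UDF):

```sql
-- The temp-table staging step folded into a CTE: one statement,
-- no materialized scratch table for the optimizer to work around.
WITH General AS (
    SELECT Code,
           dbo.GetTranslatedText(Name, 'en-US') AS Name
    FROM ProductionResponse.ProductionResponse
)
SELECT Code, Name
FROM General;
```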
--CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
in Sets / Trees and Hierarchies in SQL -
How to use temp table/variable
Hello,
It's SQL 2008 R2. I need to bring data from Oracle, using the .NET Providers/ODBC Data Provider, into an MS SQL table, converting Oracle UTC dates to PST. The source connection type is given and cannot be changed. For the destination I'm using OLE DB.
As a full truncate-and-load could take time, I'm trying to use a temp table or a variable so that I can use a T-SQL MERGE or NOT EXISTS to bring only the new records into the destination table.
I have tried different scenarios, all of which failed.
Scenario A:
1. In the Data Flow Task, after the OLE DB Source, I use a Derived Column to convert the dates. This works well.
2. Then I use a Recordset Destination with an object variable User::obj_TableACD. This also works well.
3. Then I created a string variable with a simple query that I could modify later, "select * from " + (DT_WSTR,10)@[User::obj_TableACD], trying to get data from the recordset object variable, but it is not working.
Scenario B:
1. Created a stored procedure that creates a temp table.
2. Created a string variable str_CreateTempTable to execute the SP: "EXEC dbo.TempTable". This works well in an Execute SQL Task with SQLSourceType set to Variable.
3. But then how do I populate the temp table from the Oracle source, to bring the data into the destination?
I could spend another few days figuring it out, so please help me if there is a way to solve it.
Thanks
Thank you so much, Nitesh. Now I understand temp tables in SSIS. However, in my case, to implement a T-SQL MERGE or NOT EXISTS that brings in only the new records, I'd need to load at least one table into a temp table anyway, so why not use a
destination table instead? I also noticed a remark in the article you suggested: the expert who wrote it had never actually used temp tables in SSIS.
So I decided to go with truncate, drop keys, derive columns, load, and re-create the keys in the destination table.
Thank you again; I'll keep what I learned about temp tables in SSIS for other cases. -
We migrated packages from SSIS 2008 to 2012. The package works fine in all environments except one.
SSIS 2012 intermittently fails with the below error while importing data from a source table into a destination table with the exact same schema.
Error: 2014-01-28 15:52:05.19
Code: 0x80004005
Source: xxxxxxxx SSIS.Pipeline
Description: Unspecified error
End Error
Error: 2014-01-28 15:52:05.19
Code: 0xC0202009
Source: Process xxxxxx Load TableName [48]
Description: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 11.0" Hresult: 0x80004005 Description: "Invalid date format".
End Error
Error: 2014-01-28 15:52:05.19
Code: 0xC020901C
Source: Process xxxxxxxx Load TableName [48]
Description: There was an error with Load TableName.Inputs[OLE DB Destination Input].Columns[Updated] on Load TableName.Inputs[OLE DB Destination Input]. The column status returned was: "Conversion failed because the data value overflowed
the specified type.".
End Error
But when we reorder the "Updated" column in the destination table, the package imports the data successfully.
This looks like a bug to me. Any suggestions?
Hi Mohideen,
Based on my research, the issue might be related to one of the following factors:
Memory pressure. Check whether there is memory pressure when the issue occurs. In addition, if the package runs in the 32-bit runtime on that server, use the 64-bit runtime instead.
A known issue with SQL Server Native Client. As a workaround, use the .NET data provider instead of SNAC.
Hope this helps.
Regards,
Mike Yin
If you have any feedback on our support, please click
here
Mike Yin
TechNet Community Support -
SSIS package takes a long time when inserting data into temp tables
Querying records from one server and inserting them into temp tables is taking a long time.
Are there any package settings that enhance performance? Will a local temp table (#temp) enhance performance?
If you're planning to use # tables in SSIS, make sure you read this:
http://consultingblogs.emc.com/jamiethomson/archive/2006/11/19/SSIS_3A00_-Using-temporary-tables.aspx
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs -
Temp tables not dropped in SQL Server 2012
I have a server that houses a database application making heavy use of temp tables. It appears the temp tables are not getting dropped from tempdb: in Perfmon the temp table count hovers around 1000 and does not go down over time.
Even if the programmers are not using DROP TABLE at the end of their stored procedures, shouldn't these temp tables be cleaned up when they go out of scope?
Jeff
>> shouldn't these temp tables be cleaned up when they go out of scope? <<
Hello Jeff,
If global temp tables (##temp) are used, they are dropped once no session references them any longer. A local temp table is dropped as soon as the session that created it is closed.
See CREATE TABLE (Transact-SQL) => Remarks => Temporary Tables
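The stored-procedure case is worth spelling out: a temp table created inside a procedure is scoped to that procedure and is dropped automatically when it returns, even without an explicit DROP TABLE. (In practice SQL Server also caches temp objects created in procedures for reuse, which can keep the Perfmon counter high even though the tables are out of scope.) A minimal sketch:

```sql
CREATE PROCEDURE dbo.usp_TempDemo  -- hypothetical demo procedure
AS
BEGIN
    -- No DROP TABLE here: #work is dropped automatically on return
    CREATE TABLE #work (Id int NULL);
    INSERT INTO #work (Id) VALUES (1);
    SELECT COUNT(*) FROM #work;
END;
GO

EXEC dbo.usp_TempDemo;
SELECT OBJECT_ID('tempdb..#work');  -- NULL: the table no longer exists here
```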
Olaf Helper
[ Blog] [ Xing] [ MVP] -
Hi Guys,
An SSIS 2012 ETL is failing only on one server (no BIDS) but runs successfully from BIDS on a different server. In this ETL I use a stored procedure in the OLE DB Source.
Note: I have a couple of ETLs developed in 2005 using the same logic and upgraded to 2012, and they work perfectly.
I am getting the error message:
SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available. Source: "Microsoft OLE DB Provider for SQL Server" Hresult: 0x80004005 Description: "Error converting data type varchar to datetime.".
Unable to retrieve column information from the data source. Make sure your target table in the database is available.
"OLE DB Source" failed validation and returned validation status "VS_ISBROKEN".
I tried the workaround below and found it works perfectly:
I loaded the data into a table (dbo.TEMP) using the stored procedure, then used this dbo.TEMP table in the OLE DB Source, and found no issue.
My SP details (this is the SP I call in the OLE DB Source of the ETL): when I run it from one server it works fine, and when I run it from the ETL-dedicated server I get the error. Guys, help me out.
USE [TEST_DB]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
ALTER PROCEDURE [DBO].[SP_TEST]
    -- EXEC [DBO].[SP_TEST] '2014-09-30','2014-10-01'
    @FROMDATETIME DATETIME,
    @TODATETIME DATETIME
AS
SET NOCOUNT ON
BEGIN
    DECLARE @FROMDATEKEY INT, @TODATEKEY INT
    SET @FROMDATEKEY = CONVERT(VARCHAR(10), @FROMDATETIME, 112)
    SET @TODATEKEY   = CONVERT(VARCHAR(10), @TODATETIME, 112)

    IF 1 = 1
    BEGIN
        SELECT CAST(NULL AS DATETIME)    AS TXN_DATE
             , CAST(NULL AS DATETIME)    AS PROCESS_DATE
             , CAST(NULL AS MONEY)       AS S1_AMT
             , CAST(NULL AS MONEY)       AS S2_AMOUNT
             , CAST(NULL AS MONEY)       AS S2_INVALID_AMOUNT
             , CAST(NULL AS MONEY)       AS INVALID_MOVED_IN_VALID_S2_AMOUNT
             , CAST(NULL AS VARCHAR(20)) AS SYSTEM_ID
             , CAST(NULL AS MONEY)       AS S3_AMT
    END

    SELECT TXN_DATE, PROCESS_DATE, S1_AMT, S2_AMOUNT, S2_INVALID_AMOUNT,
           INVALID_MOVED_IN_VALID_S2_AMOUNT, SYSTEM_ID, S3_AMT
    FROM DBO.TABLE_1
    WHERE TNX_DATE_KEY BETWEEN @FROMDATEKEY AND @TODATEKEY
    UNION ALL
    SELECT TXN_DATE, PROCESS_DATE, S1_AMT, S2_AMOUNT, S2_INVALID_AMOUNT,
           INVALID_MOVED_IN_VALID_S2_AMOUNT, SYSTEM_ID, S3_AMT
    FROM DBO.TABLE_2
    WHERE TNX_DATE_KEY BETWEEN @FROMDATEKEY AND @TODATEKEY
    UNION ALL
    SELECT TXN_DATE, PROCESS_DATE, S1_AMT, S2_AMOUNT, S2_INVALID_AMOUNT,
           INVALID_MOVED_IN_VALID_S2_AMOUNT, SYSTEM_ID, S3_AMT
    FROM DBO.TABLE_3
    WHERE TNX_DATE_KEY BETWEEN @FROMDATEKEY AND @TODATEKEY
END
Data access mode: SQL command from variable
"EXEC [DBO].[SP_TEST] '" + (DT_WSTR, 24) @[User::V_EXTRACT_FROM_DT] + "','" + (DT_WSTR, 24) @[User::V_EXTRACT_TO_DT] + "'"
where the variables @[User::V_EXTRACT_FROM_DT] and @[User::V_EXTRACT_TO_DT] are defined as DateTime.
Thanks Shiven :) If the answer is helpful, please vote.
Hi,
Yes, you are right. On the server where I was getting the error, the datetime format was the USA format, while on the server where it ran successfully it was the AUS format.
I changed from the USA format to the AUS format and made one more change:
Data access mode: SQL command
EXEC [DBO].[SP_TEST] @FROMDATETIME = ?, @TODATETIME = ?
and it is working fine.
Thanks Shiven:) If Answer is Helpful, Please Vote -
Hi,
I have a solution with 20 packages. Target Database is same for all the packages, only target tables are different. All are working fine.
Now, as part of a new requirement, I have to change some logic in one of the packages. I have to create a few global temp tables on the fly, populate them with data, and use those tables for updates and deletes in both the control flow and the data flow.
Since I am creating and using global temp tables on the fly, I have to set DelayValidation = True for the package and the DFTs.
Then, most importantly, I have to set RetainSameConnection = True on the target database connection, because only then, I assume, will the package avoid errors like 'table not available'.
My doubt is: if I change RetainSameConnection to True, will it create any issue for the other packages, since all of them use the same connection manager?
I am using SSIS 2012.
Thanks for your help!
Although 'RetainSameConnection' is a property of the project connection manager, it only works within a package; as soon as another (child) package starts, you lose the connection.
The safest option is probably to use a separate package connection manager.
Please mark the post as answered if it answers your question | My SSIS Blog:
http://microsoft-ssis.blogspot.com |
Twitter -
Inserting data into a global temp table?
Hello experts,
I have a form with base tables master and detail, and I can insert and update records.
Requirement: create global temp tables for the same form so that data is saved only for the form session.
For this I created 2 global temp tables with the same structure required by the form.
I changed the block's base table property to the global temp table.
Now I am trying to save the records, but they are not going into the temp tables, even though I changed the block's base table property (as well as the advanced data block properties) to the temp tables.
Please tell me the reason; where could the problem be?
Thanks, Yash
Edited by: yash_08031983 on Apr 16, 2012 1:27 AM
>> I am trying to save the records but they are not going into the temp table. <<
How do you check that? You cannot go to SQL*Plus and check whether there are any records in the GTT. Data in a GTT is only visible in the current session (= only in the form).
What is the use of a GTT here? What are you trying to achieve?
Global Temp Table or PL/SQL Table
I am trying to determine whether this can be done using only a PL/SQL table. If not, will the use of a global temp table affect performance?
Here is the situation,
I have a data block that is based on a stored procedure. This stored procedure returns a table of records built from different database tables with join conditions. Some of the fields within the table of records will not have data returned from the database tables; they will be displayed as fields on the form and the data will be entered by the user.
For example:
The records will look like:
Id (populated by the procedure)
Hist_avg (populated by the procedure)
My_avg (a field on the form so that the user can enter their own average)
Checked (populated by the procedure)
My questions are:
1. Is this doable in Forms using a data block based on a PL/SQL table?
2. Will users be able to manipulate (update) the data based on the PL/SQL table in memory as they wish, and invoke the procedure to update the underlying table when clicking a button (Update Avg)?
3. What are the advantages of a PL/SQL table versus a global temp table, from both the database and Forms points of view?
Any info is appreciated.
Hi there...
Here is the Reference...
http://asktom.oracle.com/pls/ask/f?p=4950:8:2939484874961025998::NO::F4950_P8_DISPLAYID,F4950_P8_CRITERIA:604830985638
Best Regards...
Muhammad Waseem Haroon -
SSIS 2012 EXCEL Connection manager error
Hello all,
When I try to create any Excel connection manager in SSIS 2012, it shows the error below:
"Could not retrieve the table information for the connection manager 'Excel Connection Manager'. Failed to connect to the source using the connection manager 'Excel Connection Manager'."
I tried both an Excel Source and an Excel Destination; the same error happens both times.
Before, I used SSIS 2008 and it worked well, but after installing 2012 this problem started. I also changed the 64-bit debug setting (Run64BitRuntime) from True to False.
Thanks for the help.
Hi, is there any way to install the 32-bit ACE OLE DB provider on a 64-bit system without uninstalling my other software?
I tried yesterday to install the 32-bit ACE, but it told me I need to uninstall the 64-bit software first :(
thanks
When using the command line, you can force the installation of the 32-bit version while the 64-bit version is already installed:
http://blogs.lessthandot.com/index.php/datamgmt/dbprogramming/mssqlserver/force-ace-installation/
MCSE SQL Server 2012 - Please mark posts as answered where appropriate.