Archived Task Snagger


I recently did some data clean-up of my Task table… exercises of the “mass” variety, which typically involve the Data Loader, a spreadsheet application, and a lot of coffee. My Task export file seemed a bit light, however. Then I remembered that salesforce archives your old tasks and events. You can read more about it here. Because archived tasks can only be pulled via “queryAll()” and not “query()”, the Data Loader isn’t much help.

No big deal… you can still update archived activities through the Data Loader, you just can’t export them. So I ran a report… which also didn’t work. There are actually two ideas posted regarding this:

Allow for Reporting of Closed Activities greater than 365 days old
Be able to report on archived tasks

If you click one of those and it doesn’t work, that’s probably good news… it might mean they were merged. Anyhow, there didn’t seem to be any conventional method for getting my archived Tasks from the cloud into a CSV file. So I built something.

I’m using AJAX to get the info from the API instead of an Apex controller, because my experimentation with Apex seemed to hit the same limitation (queries were only returning unarchived records). It’s quick and dirty, but it did the trick… so I figured I would share the code.


1. Visualforce Page


Create a new Visualforce Page by any method, and copy and paste the code below into the body. Edit the fields2show array near the top to control which Task fields are displayed. You can also edit the line that appends ” from Task” to the query string if you need to add filters, a limit, or a sort order.
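To see what that query string ends up lookingking like before you run the page, here is a standalone sketch of the same string-building logic, runnable in any JavaScript console outside of salesforce (buildQueryString is my name for it; the page itself does this inline):

```javascript
// Standalone sketch of the page's query-string construction.
// fields2show mirrors the array at the top of the Visualforce page.
var fields2show = ["Id", "Subject", "ActivityDate", "OwnerId", "Status"];

function buildQueryString(fields) {
  var qstring = "Select ";
  for (var g = 0; g < fields.length; g++) {
    qstring += fields[g] + ", ";
  }
  // Shave off the trailing comma and complete the statement.
  // Append filters, an ORDER BY, or a LIMIT here if you need them.
  return qstring.substring(0, qstring.length - 2) + " from Task";
}

console.log(buildQueryString(fields2show));
// Select Id, Subject, ActivityDate, OwnerId, Status from Task
```

Any field you add to the array shows up both as a table column and in the SOQL.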

Save it, then navigate to the Page in salesforce. Depending on the browser you’re using and how many records and fields you’re querying, the browser may warn you about a script taking a long time to execute. Just keep telling it to let the script run, or to “Continue”, as many times as it asks.

When it’s finished, you’ll have all your data displayed in one, possibly quite long, table. The next step is very low-tech… copy everything in the table, including the header row, and paste it into a spreadsheet. Then save as CSV. There you go, now you can have your way with the data, and simply use the Data Loader to do an Update, Upsert, or Delete.
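If you want to skip the copy-and-paste step entirely, the row data can be serialized to CSV in script instead. This helper is not part of the original page; it is a sketch that quotes fields per the usual CSV convention (RFC 4180), so commas and quotes inside a Subject won’t break your columns:

```javascript
// Sketch: turn a header row plus record rows into CSV text.
// Values containing commas, quotes, or newlines get quoted, and
// embedded double quotes are doubled, per RFC 4180.
function toCsv(header, rows) {
  function escapeField(value) {
    var s = value == null ? "" : String(value);
    if (/[",\n]/.test(s)) {
      s = '"' + s.replace(/"/g, '""') + '"';
    }
    return s;
  }
  var lines = [header].concat(rows).map(function (row) {
    return row.map(escapeField).join(",");
  });
  return lines.join("\n");
}

var csv = toCsv(
  ["Id", "Subject"],
  [["00T000000000001", "Call back, re: renewal"]]
);
// Id,Subject
// 00T000000000001,"Call back, re: renewal"
```

From there you could dump the text into a textarea for one-click copying rather than selecting the whole table.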



<apex:page>
<apex:sectionHeader title="queryAll Tasks"/>
<apex:messages style="color:red"/>

<apex:includeScript value="/soap/ajax/20.0/connection.js"/>

<script type="text/javascript">

// Change this array to include your desired fields:
var fields2show = ["Id","Subject","ActivityDate","OwnerId","Status"];

sforce.connection.sessionId = '{!$Api.Session_Id}';
var matrix;

function go(){
    matrix = document.getElementById('matrix');
    buildMatrix();
}

function buildMatrix(){

    var qstring = "Select ";

    // If you beef up the code to refresh on some action, you'll want this here to clear the table
    for(var i = matrix.rows.length; i > 0; i--){
        matrix.deleteRow(i - 1);
    }

    // First we create the header row
    var newr = matrix.insertRow(matrix.rows.length);
    for(var g = 0; g < fields2show.length; g++){

        var newd = newr.insertCell(newr.cells.length);
        newd.innerHTML = "<b>" + fields2show[g] + "</b>";

        // We also build out the query string we'll use
        qstring += fields2show[g] + ", ";
    }

    // We shave off the last comma, and complete it
    // Add a limit here if you want... or other filters
    qstring = qstring.substring(0, qstring.length - 2) + " from Task";

    // Make the call to the API
    var result = sforce.connection.queryAll(qstring);
    var queryMore = true;

    // Add results to our table using DHTML
    while (queryMore) {
        var records = result.getArray("records");
        for (var i = 0; i < records.length; i++) {

            var newr = matrix.insertRow(matrix.rows.length);

            for(var w = 0; w < fields2show.length; w++) {
                var newd = newr.insertCell(newr.cells.length);
                newd.innerHTML = records[i].get(fields2show[w]);
            }
        }
        if (result.getBoolean("done")) {
            queryMore = false;
        } else {
            result = sforce.connection.queryMore(result.queryLocator);
        }
    }
}

// Kick everything off once the page has loaded
window.onload = go;

</script>

<apex:form >
<apex:pageBlock >
<table id='matrix' style='width:100%'/>
</apex:pageBlock>
</apex:form>
</apex:page>


